MCP Server Foundation Template
A customizable, production-ready foundation template for building Model Context Protocol (MCP) servers. This template follows MCP best practices and provides a clean, well-structured starting point for creating your own MCP servers.
Features
Dual Transport Support: Both stdio (CLI) and HTTP (SSE) transport modes
Comprehensive Structure: Clear separation of tools, resources, and prompts
TypeScript: Full type safety with modern TypeScript
FastMCP: Built on the FastMCP framework for simplicity and performance
Docker Ready: Complete Docker and docker-compose support
Well Documented: Extensive documentation for usage, customization, and architecture
Extensible: Easy to add custom tools, resources, and prompts
Production Ready: Includes error handling, graceful shutdown, and best practices
Quick Start
Prerequisites
Node.js 20+ or Bun 1.0+
Python 3 (optional, for Python tools)
Docker (optional, for containerized deployment)
Installation
Clone and setup:
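The setup commands were likely a fenced block here; a minimal sketch (the repository URL is a placeholder — substitute your own fork or clone of this template):

```shell
# Clone your copy of the template (URL is a placeholder)
git clone https://github.com/your-org/your-mcp-server.git
cd your-mcp-server

# Install dependencies (use `bun install` if you prefer Bun)
npm install

# Build the TypeScript sources
npm run build
```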
Usage
Native Setup
stdio Transport (CLI Mode)
Start the server in stdio mode for command-line usage:
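Using the npm scripts listed under Development below:

```shell
# Run the server over stdio (reads JSON-RPC from stdin, writes to stdout)
npm start
```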
HTTP Transport (Web Mode)
Start the server in HTTP mode for web integration:
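Again using the documented npm scripts, with the default port from the Configuration section:

```shell
# Run the server over HTTP/SSE (defaults: host 0.0.0.0, port 3001)
npm run start:http

# In another terminal, verify the SSE endpoint responds
curl -N http://localhost:3001/sse
```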
Docker Setup
Using Docker Compose
Development mode (with hot reload):
Production mode (optimized):
Default mode (both stdio + http):
Start specific service:
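The modes above might be invoked roughly as follows — the service names are assumptions; check docker-compose.yml for the names this template actually defines:

```shell
# Service names below are illustrative — see docker-compose.yml
docker compose up dev          # development mode with hot reload
docker compose up -d prod      # production mode, detached
docker compose up              # default mode: both stdio + http
docker compose up mcp-http     # start a specific service only
```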
Using Docker Directly
See DOCKER.md for the complete Docker usage and deployment guide.
Architecture
This template implements the Model Context Protocol (MCP) architecture:
Components:
Transport Layer: Handles communication (stdio or HTTP)
Data Layer: JSON-RPC 2.0 protocol
Server Core: FastMCP framework
Primitives: Tools, Resources, Prompts
See ARCHITECTURE.md for detailed architecture documentation.
Customization
Adding Tools
Tools are functions that the AI can call to perform actions.
Node.js/TypeScript Tools
Create a new file src/tools/your_tool.ts:
Then register it in src/tools/index.ts:
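Registration might look like the following — the `registerTools` helper name is an assumption about this template's layout:

```typescript
// src/tools/index.ts — illustrative registration pattern.
import type { FastMCP } from "fastmcp";
import { greetTool } from "./your_tool.js";

// Called once at startup with the shared server instance.
export function registerTools(server: FastMCP): void {
  server.addTool(greetTool);
  // server.addTool(anotherTool);
}
```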
Python Tools
For Python tools, you can:
Execute Python scripts: Use child_process to run Python scripts
Create a Python MCP proxy: Separate MCP server for Python tools
Use Python execution libraries: Use libraries like python-shell
See src/tools/python.ts for implementation patterns.
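As one concrete pattern (a sketch, not necessarily what src/tools/python.ts does), a child_process-based runner that honors the PYTHON_PATH setting from the Configuration section:

```typescript
// Minimal script-runner sketch using only Node's standard library.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

// Run an interpreter with the given args and return trimmed stdout.
// Defaults to the PYTHON_PATH env var documented under Configuration.
export async function runScript(
  args: string[],
  interpreter: string = process.env.PYTHON_PATH ?? "python3",
  timeoutMs = 30_000,
): Promise<string> {
  const { stdout } = await execFileAsync(interpreter, args, { timeout: timeoutMs });
  return stdout.trim();
}
```

The timeout default mirrors MAX_TOOL_EXECUTION_TIME's documented default of 30000 ms.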
Adding Resources
Resources are read-only data sources that the AI can access.
Create a resource in src/resources/your_resource.ts:
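A sketch, assuming FastMCP's resource shape (a URI, a MIME type, and a `load` function returning the contents); the URI scheme and fields here are illustrative:

```typescript
// src/resources/your_resource.ts — illustrative read-only resource.
export const serverInfoResource = {
  uri: "info://server/status",          // illustrative URI scheme
  name: "Server Status",
  mimeType: "application/json",
  // Resources are read-only: load() returns the current contents.
  load: async () => ({
    text: JSON.stringify({ status: "ok", uptimeSeconds: process.uptime() }),
  }),
};
```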
Register in src/resources/index.ts.
Adding Prompts
Prompts are template-based messages for the AI.
Create a prompt in src/prompts/your_prompt.ts:
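A sketch, assuming a prompt is a named template with declared arguments and a `load` function that renders the final message; the `summarize` example is illustrative:

```typescript
// src/prompts/your_prompt.ts — illustrative template-based prompt.
export const summarizePrompt = {
  name: "summarize",
  description: "Ask the model to summarize a piece of text",
  arguments: [
    { name: "text", description: "Text to summarize", required: true },
  ],
  // load() renders the final message from the supplied arguments.
  load: async (args: { text: string }) =>
    `Summarize the following text in three bullet points:\n\n${args.text}`,
};
```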
Register in src/prompts/index.ts.
Transport Modes
stdio Transport
Use Case: CLI tools, local development, Cursor integration
Communication: stdin/stdout
Network: None (local process communication)
Access: Single user, local only
Example: AI assistant in terminal
HTTP Transport
Use Case: Web apps, remote access, team sharing
Communication: Server-Sent Events (SSE)
Network: TCP/IP over HTTP
Access: Multi-user, remote capable
Example: Shared AI tools for team
Configuration
Environment Variables
The server is configured using environment variables. Get started quickly:
Quick Reference
Server Settings:
TRANSPORT: stdio or http (default: stdio)
PORT: HTTP port (default: 3001)
HOST: HTTP host binding (default: 0.0.0.0)
Logging:
LOG_LEVEL: error, warn, info, or debug (default: info)
LOG_FORMAT: json or text (default: text)
Security:
API_KEY: API authentication key (optional)
JWT_SECRET: JWT token secret (optional)
ALLOWED_ORIGINS: Comma-separated CORS origins (optional)
Feature Flags:
ENABLE_TOOLS: Enable tools (default: true)
ENABLE_RESOURCES: Enable resources (default: true)
ENABLE_PROMPTS: Enable prompts (default: true)
Tool Execution:
PYTHON_PATH: Python executable path (default: python3)
NODE_PATH: Node.js executable path (default: node)
MAX_TOOL_EXECUTION_TIME: Max tool execution time in ms (default: 30000)
Using Configuration in Code
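A minimal sketch of reading these variables with the defaults documented above — the `loadConfig` name and interface are illustrative, not this template's actual config module:

```typescript
// Illustrative config reader mirroring the documented defaults.
export interface ServerConfig {
  transport: "stdio" | "http";
  port: number;
  host: string;
  logLevel: string;
}

export function loadConfig(
  env: Record<string, string | undefined> = process.env,
): ServerConfig {
  return {
    transport: env.TRANSPORT === "http" ? "http" : "stdio", // default: stdio
    port: Number(env.PORT ?? 3001),
    host: env.HOST ?? "0.0.0.0",
    logLevel: env.LOG_LEVEL ?? "info",
  };
}
```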
Full Documentation
See CONFIGURATION.md for:
Complete environment variable reference
Configuration best practices
Cloud deployment configuration
Example usage patterns
Troubleshooting guide
Development
Development Mode
Auto-reload on file changes:
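Using the npm script from the list below:

```shell
# Watch the sources and restart on changes
npm run dev
```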
Scripts
npm start - Start in stdio mode
npm run start:http - Start in HTTP mode
npm run dev - Development mode with auto-reload
npm run build - Build TypeScript
npm run lint - Run ESLint
npm run type-check - Type checking without emit
Project Structure
See PLANNING.md for development planning and ARCHITECTURE.md for architecture details.
Testing
Local Testing
Test your MCP server with FastMCP CLI:
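For example — the src/index.ts entry point is an assumption about this template's layout:

```shell
# Interactive dev session against your server entry point
npx fastmcp dev src/index.ts

# Open the MCP Inspector UI against the same entry point
npx fastmcp inspect src/index.ts
```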
Integration Testing
Connect from Cursor:
Open Cursor Settings
Features → MCP Servers → Add new server
Configure:
stdio: command: npm start
http: url: http://localhost:3001/sse
Deployment
Docker Deployment
Production deployment:
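A minimal sketch — the image name is a placeholder, and the port and env vars follow the defaults from the Configuration section:

```shell
# Image name is a placeholder — adjust to your project
docker build -t my-mcp-server .
docker run -d -p 3001:3001 -e TRANSPORT=http -e LOG_FORMAT=json my-mcp-server
```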
Cloud deployment options:
Railway: railway up
Render: Configure via render.yaml
Fly.io: fly launch
Kubernetes: Use k8s deployment manifests
See DOCKER.md for the full deployment guide.
Cloud Deployment
Deploy to cloud platforms (AWS, GCP, Azure) using Docker or native binaries.
Documentation
README.md: This file - getting started and usage
QUICK_START.md: Quick start guide
PLANNING.md: Development planning and task management
ARCHITECTURE.md: Detailed architecture documentation
TASK.md: Current tasks and progress
CONFIGURATION.md: Configuration guide
DOCKER.md: Complete Docker deployment guide
Contributing
Contributions welcome! See the main project for contribution guidelines.
License
MIT License - see LICENSE file for details
Acknowledgments
Built on FastMCP
Model Context Protocol by Anthropic
Template inspired by mcpdotdirect/template-mcp-server