Template MCP Server
A Model Context Protocol (MCP) server template that provides a foundation for building MCP servers. This template can be customized for various data operations and management functionality.
1. Description
The Template MCP Server is a production-ready foundation for building Model Context Protocol (MCP) servers. It provides a complete framework with:
- FastAPI-based HTTP server with multiple transport protocol support
- Modular tool system for easy extension and customization
- Comprehensive testing and deployment configurations
- OpenShift deployment ready with SSL support
- Simplified architecture with everything organized as tools for maximum agent compatibility
The server supports multiple transport protocols (HTTP, SSE, Streamable-HTTP) and includes built-in tools for mathematical operations, code review prompts, and asset access.
Design Philosophy: This template focuses on MCP tools as the primary interface since they have universal support across all MCP clients including LangGraph, CrewAI, and others. This ensures maximum compatibility and ease of use.
2. Architecture
2.1 Flow Diagram
2.2 Code Structure
3. Quick Start - Create Your Own MCP Server
🚀 Want to create your own domain-specific MCP server from this template? Use our automated transformation script!
3.1 Automated Template Transformation
The fastest way to create your own MCP server is to use our transformation script. You have two options:
Option A: Download Script Only (Recommended)
Option B: Clone First, Then Transform
⚠️ Important: If you clone first, run the script BEFORE changing into the template directory:
3.2 What the Script Does
The transformation script automatically:
- ✅ Renames all files and directories with your project name
- ✅ Updates all code references from template to your domain
- ✅ Modifies configuration files (pyproject.toml, Containerfile, etc.)
- ✅ Updates documentation (README, deployment configs)
- ✅ Preserves all functionality (tests, tools, deployment configs)
- ✅ Creates a ready-to-use project in a new directory
3.3 After Transformation
⚠️ Critical: You MUST install the package after transformation for imports to work:
Why this step is required: The transformation script creates a new Python package with your project name (e.g., party_lens_mcp_server), but Python needs the package to be installed before it can be imported in tests and at runtime.
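A typical editable install from the root of the newly generated project (assuming you use uv, as this template does; plain `pip install -e .` also works):

```shell
# Editable install: code changes are picked up without reinstalling
uv pip install -e .
```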
Then start customizing by:
- Adding your domain-specific tools in src/tools/
- Updating the example tools to match your use case
- Modifying tests for your new functionality
- Deploying using the included OpenShift configs
📚 For detailed transformation documentation, see scripts/README.md
4. Manual Installation (Alternative)
💡 Tip: If you just want to use this template to create your own MCP server, use the transformation script instead of manual installation.
Prerequisites
- Python 3.12 or higher
- uv (install from https://docs.astral.sh/uv/getting-started/installation/)
- Podman & Podman Desktop (macOS): Podman Desktop is needed to install podman-mac-helper, a utility that bridges the Podman engine and the Docker socket on macOS. It lets Docker commands and tools, such as Maven or Testcontainers, work with Podman without reconfiguration.
Install from source
5. Run the Tests
6. Environment File
Copy the contents of .env.template to .env using the command shown below:
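Assuming a POSIX shell, this is a single copy from the project root:

```shell
# Create your local environment file; edit the values afterwards as needed
cp .env.template .env
```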
Below is a snapshot of the .env.template file:
7. Security Considerations
⚠️ IMPORTANT: This server includes an OAuth2 compatibility mode (COMPATIBLE_WITH_CURSOR) that significantly reduces security to accommodate certain clients, such as Cursor.
7.1 Transport Protocol
The server supports multiple transport protocols, configurable via the MCP_TRANSPORT_PROTOCOL environment variable:
- http/streamable-http: Standard HTTP for request-response communication (both use the same implementation)
- sse: Server-Sent Events (SSE) for event-driven communication (deprecated)
Note: Both http and streamable-http protocols use the same HTTP implementation and are functionally identical. We recommend using http or streamable-http for most use cases as they provide the best compatibility and performance. The SSE protocol is deprecated and should only be used if specifically required for legacy clients like Goose users on Linux desktop environments.
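For example, to pin the transport explicitly before starting the server:

```shell
# Select the transport protocol: http, streamable-http, or sse
export MCP_TRANSPORT_PROTOCOL=http
```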
8. Usage (Run locally)
Before running the server locally, make sure a PostgreSQL service is running and accessible. If you don't already have one running, you can start the included PostgreSQL container with:
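A typical invocation looks like the following; the image name, credentials, and port used by this repo's actual configs may differ, so treat this as a sketch:

```shell
# Hypothetical example -- check the project's container configs for the
# real image name, credentials, and any volume settings
podman run -d --name template-postgres \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_DB=template \
  -p 5432:5432 docker.io/library/postgres:16
```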
Method 1: Using Python directly
Method 2: Using the installed script
Method 3: Using Podman Container
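The three methods can be sketched as follows; the module path, console-script name, and image tag here are hypothetical, so check pyproject.toml and the Containerfile for the real ones:

```shell
# Method 1: run the package directly (module path is an assumption)
python -m template_mcp_server

# Method 2: use the console script installed by the editable install
template-mcp-server

# Method 3: build and run the container image with Podman
podman build -t template-mcp-server -f Containerfile .
podman run -p 8080:8080 template-mcp-server
```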
9. Server Endpoints
Once the server is running, it will be available at:
9.1 HTTP Protocol (http/streamable-http)
- MCP Server: http://0.0.0.0:8080/mcp
- Health Check: http://0.0.0.0:8080/health
9.2 SSE Protocol
- SSE Endpoint: http://0.0.0.0:8080/sse
- Health Check: http://0.0.0.0:8080/health
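With the server running locally, the health endpoint makes a quick smoke test:

```shell
# Expect an HTTP 200 with a small status body
curl -s http://0.0.0.0:8080/health
```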
10. Deploy on OpenShift
The project includes complete OpenShift deployment configurations in the openshift/ directory:
Server Endpoints
Once the server is deployed, it will be available at:
HTTP Protocol (http/streamable-http)
- MCP Server: https://template-mcp-server.apps.int.spoke.preprod.us-west-2.aws.paas.redhat.com/mcp
- Health Check: https://template-mcp-server.apps.int.spoke.preprod.us-west-2.aws.paas.redhat.com/health
SSE Protocol
- SSE Endpoint: https://template-mcp-server.apps.int.spoke.preprod.us-west-2.aws.paas.redhat.com/sse
- Health Check: https://template-mcp-server.apps.int.spoke.preprod.us-west-2.aws.paas.redhat.com/health
OpenShift Configuration
- Namespace: ddis-asteroid--template
- Port: 8443 (HTTPS)
- SSL: Configured with TLS certificates
- Resources: 1 CPU, 1Gi memory
- Health Checks: Liveness and readiness probes configured
11. Examples
Prerequisites
Run the examples from another shell session inside a virtual environment. You can create a new venv as follows:
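One common way to do that (uv users can equivalently run `uv venv`):

```shell
# Create and activate a fresh virtual environment for the example clients
python3 -m venv .venv
. .venv/bin/activate
```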
FastMCP Client Example
This example demonstrates:
- Connecting to the MCP server
- Using available tools (multiply_numbers, generate_code_review_prompt, get_redhat_logo)
- Mathematical operations and code analysis
- Asset retrieval functionality
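A minimal sketch of such a client, assuming the fastmcp package's Client API; the tool argument names "a" and "b" are guesses, so check the schema returned by list_tools for the real ones:

```python
# Sketch only: requires the server running on http://0.0.0.0:8080 and the
# fastmcp package installed; argument names "a"/"b" are assumptions.
import asyncio

from fastmcp import Client


async def main() -> None:
    async with Client("http://0.0.0.0:8080/mcp") as client:
        tools = await client.list_tools()
        print("available tools:", [t.name for t in tools])
        result = await client.call_tool("multiply_numbers", {"a": 6, "b": 7})
        print("multiply_numbers result:", result)


if __name__ == "__main__":
    asyncio.run(main())
```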
LangGraph Client Example
Prerequisites:
- Template MCP server must be running on http://0.0.0.0:8080
- Google Generative AI credentials must be configured via GEMINI_API_KEY environment variable or GOOGLE_APPLICATION_CREDENTIALS environment variable
- All required Python packages must be installed
- Required dependencies: langchain-google-genai, langchain-mcp-adapters, langgraph
This example shows:
- LangGraph agent integration
- Google Gemini model usage
- Tool calls for mathematical operations
- Conversational AI workflows
12. How to Customize the Template
Adding New Tools
- Create a new tool file in template_mcp_server/src/tools/:
- Register the tool in template_mcp_server/src/mcp.py:
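For instance, a hypothetical temperature-conversion tool modeled on the built-in multiply_numbers tool; the exact registration decorator depends on how template_mcp_server/src/mcp.py exposes its server instance:

```python
# Hypothetical new tool. Register it in template_mcp_server/src/mcp.py
# using the server's tool decorator (e.g. @mcp.tool() if the template
# exposes a FastMCP instance) -- adapt to the template's actual pattern.
def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0
```

Keeping the tool body a plain, typed function like this makes it easy to unit-test independently of the MCP server.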
Adding Assets
If your tools need to access static files (images, data files, etc.), place them in the template_mcp_server/src/assets/ directory:
- Add your asset file to template_mcp_server/src/assets/:
- Access the asset from your tool:
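A sketch of an asset-loading helper; the ASSETS_DIR constant is an assumption about the installed package layout, so adjust the path to match yours:

```python
from pathlib import Path

# Assumed location of the bundled assets, relative to this module.
ASSETS_DIR = Path(__file__).resolve().parent / "assets"


def load_asset_bytes(filename: str, base_dir: Path = ASSETS_DIR) -> bytes:
    """Return the raw bytes of a bundled asset, rejecting path traversal."""
    base = base_dir.resolve()
    path = (base / filename).resolve()
    if base not in path.parents:
        raise ValueError(f"asset path escapes assets directory: {filename}")
    return path.read_bytes()
```

The traversal check keeps a tool from being tricked into serving files outside the assets directory (e.g. via a "../" in the filename).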
Updating Configuration
- Add new environment variables to template_mcp_server/src/settings.py:
- Update .env.template with your new variables:
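A sketch of the idea; the template's settings.py may use a settings class from another library (e.g. pydantic BaseSettings), and the variable names below are placeholders:

```python
import os
from dataclasses import dataclass


# Hypothetical settings addition -- variable names are placeholders.
# Mirror each new variable in .env.template, for example:
#   MY_API_BASE_URL=https://api.example.com
#   MY_REQUEST_TIMEOUT=30
@dataclass(frozen=True)
class ExtraSettings:
    api_base_url: str
    request_timeout: int


def load_extra_settings(env=os.environ) -> ExtraSettings:
    """Read the new variables from the environment, with safe defaults."""
    return ExtraSettings(
        api_base_url=env.get("MY_API_BASE_URL", "https://api.example.com"),
        request_timeout=int(env.get("MY_REQUEST_TIMEOUT", "30")),
    )
```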
Customizing the Server
- Update server behavior: Modify template_mcp_server/src/mcp.py
- Add middleware: Update template_mcp_server/src/api.py
- Customize logging: Modify template_mcp_server/utils/pylogger.py
- Add authentication: Extend the FastAPI app in template_mcp_server/src/api.py
Testing Your Changes
Container Testing
The project includes comprehensive container tests in tests/test_container.py that verify:
- Rootless container build with Red Hat UBI Python 3.12
- Container execution and health verification
- SSL/HTTPS configuration capability
- Production deployment readiness
Container Features Tested:
- ✅ Red Hat UBI base image compliance
- ✅ Rootless operation (no root user required)
- ✅ Virtual environment isolation
- ✅ Red Hat certificate integration
- ✅ HTTP/HTTPS server startup
- ✅ Source code structure validation
- ✅ Podman build and execution validation
Requirements:
- podman must be available (Red Hat's container engine)
- Network access for base image download
- ~2-3 minutes for the initial build
Design Philosophy and Compatibility
This template follows a tools-first approach for maximum compatibility:
- ✅ Everything as Tools: All functionality (math operations, code review, asset access) is implemented as MCP tools
- ✅ Universal Client Support: Tools work with all MCP clients including LangGraph, CrewAI, and others
- ✅ Simplified Architecture: A single tools/ directory contains all functionality
- ✅ Easy Extension: Adding new capabilities is as simple as creating a new tool
Benefits of the Tools-First Approach:
- Maximum Compatibility: Works with any MCP client
- Consistent Interface: All functionality accessed through the same tool protocol
- Easy Testing: All features can be tested using the same patterns
- Future-Proof: As MCP evolves, tools remain the most stable interface
13. AI Development Assistant
This template includes .cursor/rules.md, a comprehensive development guide specifically designed to help AI coding assistants understand and work effectively with this MCP server template.
What's Included
The .cursor/rules.md file provides:
- Enterprise containerization patterns (Podman, Red Hat UBI, rootless containers)
- MCP development best practices (tool design, error handling, testing patterns)
- FastAPI + MCP integration examples with real code snippets
- Container testing strategies matching our test_container.py implementation
- AI assistant guidelines for working with this specific template architecture
Usage Options
You have several options for the .cursor/rules.md file:
- Keep it: Use as-is to help AI assistants understand your project structure
- Customize it: Modify the file to reflect your specific deployment needs and patterns
- Remove it: Delete the file if you don't need AI development assistance
- Contribute improvements: Submit merge requests with enhancements or fixes
Contributing
We welcome contributions to improve the AI development assistance:
- Bug fixes for incorrect patterns or outdated information
- New patterns for common MCP server development scenarios
- Documentation improvements for better AI assistant guidance
- Tool integration examples for additional development workflows
Submit your improvements via merge request - we value innovations in this area!
Deployment Considerations
- Update container configuration: Modify Containerfile (optimized for Podman/Buildah)
- Update OpenShift configs: Modify files in the openshift/ directory
- Update dependencies: Add new requirements to pyproject.toml
- Test container changes: Run pytest tests/test_container.py -v
- Update documentation: Modify this README to reflect your changes
- 1. Description
- 2. Architecture
- 3. Quick Start - Create Your Own MCP Server
- 4. Manual Installation (Alternative)
- 5. Run the Tests
- 6. Environment File
- 7. Security Considerations
- 8. Usage (Run locally)
- 9. Server Endpoints
- 10. Deploy on OpenShift
- 11. Examples
- 12. How to Customize the Template
- 13. AI Development Assistant