- Enables deployment of the MCP server via containerization, with support for different transport modes (stdio, sse, streamable-http) configured in the Dockerfile
- Supports project hosting and code submission for integration with the OmniMCP platform
## 🌐 What is MCP (Model Context Protocol)?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to large language models (LLMs). Think of MCP as the "USB-C port" for AI applications, offering a unified way to connect models to various data sources and tools.
### Why MCP?
- A growing list of pre-built integrations that LLMs can directly plug into
- Flexibility to switch between different LLM providers and vendors
- Best practices for securing both local and remote data
### General Architecture
MCP follows a client-server architecture, including:
- MCP Hosts: Programs like Claude Desktop, IDEs, or AI tools that want to access data through MCP
- MCP Clients: Protocol clients that maintain 1:1 connections with servers
- MCP Servers: Lightweight programs that expose specific capabilities through the standardized protocol
- Local Data Sources: Your computer's files, databases, and services
- Remote Services: External systems available over the internet (e.g., APIs)
### Typical Use Cases
- Building agents and AI workflows that integrate multiple data sources and tools
- Allowing LLMs to securely access local or remote resources via a standard protocol
- Rapidly integrating and switching between different LLMs and their capabilities
For more details, see the official MCP documentation.
## 🚀 OmniMCP Platform: Unique Integration & API Conversion Features
### ⭐ Add Your MCP Project to the OmniMCP Platform (omnimcp.ai)
OmniMCP is not just a deployment platform—it is a showcase and integration hub for the entire MCP ecosystem!
If you are a developer or user and want to add your MCP project (or any MCP you are interested in) to the OmniMCP platform, please follow these steps:
#### Submission Process
- Prepare your project repository (on GitHub, Gitee, or any accessible code hosting platform).
- Go to https://omnimcp.ai and submit the repository link through the platform's add-project interface.
#### Requirements
- MCP Protocol Compliance:
  - Your project must implement the MCP protocol.
- Stdio Mode Support:
  - Your project must be able to start in `stdio` mode (support for `sse` and `streamable-http` is planned for the future).
- Dockerfile (Optional but Recommended):
  - It is recommended to provide a `Dockerfile` for easy deployment. If you do not provide one, the platform will automatically generate a Dockerfile for deployment.
By following these guidelines, your MCP project can be easily integrated and showcased on the OmniMCP platform, making it accessible to a wider audience.
### 🌟 [EXCLUSIVE] Instantly Convert Any API to an MCP Server on OmniMCP
OmniMCP offers a unique, industry-leading feature: Instantly convert any OpenAPI 3.0-compatible API into a fully functional MCP server—no code changes required!
#### How to Use This Feature
- Prepare an OpenAPI 3.0 Specification
  - If your API already has an OpenAPI 3.0 (Swagger) document, you can use it directly. If not, generate one for your API.
- Submit the OpenAPI Document Link
  - Make your OpenAPI 3.0 document accessible via a public URL (e.g., a GitHub raw link or a web server).
  - Go to https://omnimcp.ai, paste the link in the API-to-MCP submission form, and click submit.
- Automatic Conversion and Enhancement
  - After submission, the OmniMCP platform will automatically convert your API into an MCP server.
  - The platform will also analyze and enhance your API documentation (e.g., API descriptions, parameter descriptions) to optimize it for AI agent usage.
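A minimal OpenAPI 3.0 document of the kind the first step calls for might look like this (the endpoint, server URL, and fields are illustrative, not a real API):

```json
{
  "openapi": "3.0.0",
  "info": { "title": "Example Weather API", "version": "1.0.0" },
  "servers": [{ "url": "https://api.example.com" }],
  "paths": {
    "/weather": {
      "get": {
        "summary": "Get current weather for a city",
        "parameters": [
          {
            "name": "city",
            "in": "query",
            "required": true,
            "description": "City name, e.g. Beijing",
            "schema": { "type": "string" }
          }
        ],
        "responses": {
          "200": {
            "description": "Current weather",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": {
                    "temperature": { "type": "number" },
                    "condition": { "type": "string" }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```

Clear `summary` and parameter `description` fields matter here, since the platform draws on them when generating tool descriptions for AI agents.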
Within a short time, your API will be available as an MCP server on the platform, ready for integration and use by AI agents and other clients.
These features make OmniMCP the most developer-friendly and AI-ready MCP platform available—empowering you to share, deploy, and transform your tools and APIs with unprecedented ease!
# mcp-demo
This project is a minimal fastmcp demo, using Python 3.12 and uv for dependency and process management. It provides a simple MCP service with an addition tool, and supports deployment via Docker.
## 1. Project Overview
- Built with the fastmcp framework for MCP protocol support.
- Provides a simple addition tool (`add`).
- Uses `uv` for both dependency management and process running.
- Exposes the service via HTTP (default: `0.0.0.0:8000/mcp`).
## 2. Development Environment
- Python 3.12
- uv (for virtual environment, dependency, and process management)
- fastmcp (installed as a dependency)
### Install Dependencies (with Version Pinning)
- Create the virtual environment and install dependencies with `uv`.
- Activate the virtual environment:
  - On Unix/macOS: `source .venv/bin/activate`
  - On Windows: `.venv\Scripts\activate`

After activation, you can use `uv run server.py` or other commands in the virtual environment.
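The setup commands might look like the following (the `fastmcp` version pin is illustrative; pin whatever versions your project actually needs):

```shell
# Create a local virtual environment (uv places it in .venv by default)
uv venv

# Install dependencies with a version pin (example pin; adjust as needed)
uv pip install "fastmcp>=2.0,<3.0"

# Activate the environment
source .venv/bin/activate        # Unix/macOS
# .venv\Scripts\activate         # Windows
```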
## 3. How to Develop and Add New Tools
- Define your tool function in `server.py` using the `@mcp.tool()` decorator.
- Start the service, and the tool will be available via MCP clients.
## 4. How to Start the Service
Activate the virtual environment (see above) and run the server with `uv run server.py`. The service will be available at `http://0.0.0.0:8000/mcp`.
Note:
- The Dockerfile installs `uv` and uses `uv run server.py` as the default command, ensuring consistency with local development.
- If you add a `requirements.txt`, Docker will use it to install dependencies via `uv pip install -r requirements.txt`.
To add more tools, simply extend `server.py` with new `@mcp.tool()` functions as needed.
## 5. Docker Deployment
### Set Startup Mode in Dockerfile
You can specify the startup mode (transport) directly in the Dockerfile by editing the `CMD` instruction to select stdio, SSE, or streamable-http mode.
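One way to select the mode is via the fastmcp CLI's `--transport` option (a sketch: the flag names follow fastmcp 2.x, so check them against your installed version, and note that only the last `CMD` in a Dockerfile takes effect, so keep one and remove the others):

```dockerfile
# stdio mode
CMD ["uv", "run", "fastmcp", "run", "server.py", "--transport", "stdio"]

# SSE mode
CMD ["uv", "run", "fastmcp", "run", "server.py", "--transport", "sse", "--host", "0.0.0.0", "--port", "8000"]

# streamable-http mode
CMD ["uv", "run", "fastmcp", "run", "server.py", "--transport", "streamable-http", "--host", "0.0.0.0", "--port", "8000"]
```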
With this setup, you can start the container with a single `docker run` command.
No extra arguments are needed at runtime. To change the mode, just edit the Dockerfile and rebuild the image.
### Build the Docker Image
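For example (the image name `mcp-demo` is an assumption; use whatever tag fits your registry):

```shell
docker build -t mcp-demo .
```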
### Run the Container
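For example (again assuming the image tag `mcp-demo`; the port mapping matches the HTTP default used above):

```shell
docker run --rm -p 8000:8000 mcp-demo
```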
The service will be available at `http://0.0.0.0:8000/sse` inside the container (when using SSE mode).