# Awesome MCP FastAPI
A FastAPI-based implementation of the Model Context Protocol (MCP) with enhanced tool registry capabilities, leveraging FastAPI's mature ecosystem.
## Overview
Awesome MCP FastAPI is a production-ready implementation of the Model Context Protocol that enhances and extends the standard MCP functionality by integrating it with FastAPI's robust ecosystem. This project provides an improved tool registry system that makes it easier to create, manage, and document AI tools for Large Language Models (LLMs).
## Why This Is Better Than Standard MCP
While the Model Context Protocol provides a solid foundation for connecting AI models with tools and data sources, our implementation offers several significant advantages:
### FastAPI's Mature Ecosystem
- Production-Ready Web Framework: Built on FastAPI, a high-performance, modern web framework with automatic OpenAPI documentation generation.
- Dependency Injection: Leverage FastAPI's powerful dependency injection system for more maintainable and testable code.
- Middleware Support: Easy integration with authentication, monitoring, and other middleware components.
- Built-in Validation: Pydantic integration for robust request/response validation and data modeling.
- Async Support: First-class support for async/await patterns for high-concurrency applications.
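For example, a minimal FastAPI endpoint combining validation, dependency injection, and async handling might look like the sketch below (the endpoint, model, and dependency names are illustrative, not taken from this project):

```python
from typing import Annotated

from fastapi import Depends, FastAPI
from pydantic import BaseModel, Field

app = FastAPI()


class EchoRequest(BaseModel):
    # Pydantic validates incoming JSON against this schema automatically.
    message: str = Field(min_length=1, max_length=200)


def get_prefix() -> str:
    # A trivial dependency; real ones might provide DB sessions or auth context.
    return "echo: "


@app.post("/echo")
async def echo(
    body: EchoRequest,
    prefix: Annotated[str, Depends(get_prefix)],
) -> dict[str, str]:
    """Async endpoint using dependency injection and validated input."""
    return {"result": prefix + body.message}
```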
### Enhanced Tool Registry
Our implementation improves upon the standard MCP tool registry in several ways:
- Automatic Documentation Generation: Tools are automatically documented in both MCP format and OpenAPI specification.
- Improved Type Hints: Enhanced type information extraction for better tooling and IDE support.
- Richer Schema Definitions: More expressive JSON Schema definitions for tool inputs and outputs.
- Better Error Handling: Structured error responses with detailed information.
- Enhanced Docstring Support: Better extraction of documentation from Python docstrings.
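The general technique behind this is to derive a tool's schema and description from its type hints and docstring. The snippet below is a hypothetical, simplified sketch of that idea using Pydantic (the `tool_schema` helper and its output shape are illustrative, not the project's actual registry code):

```python
import inspect

from pydantic import create_model


def tool_schema(func):
    """Build a JSON Schema for a function's parameters from its type hints.

    Illustrative only; a real registry would attach more metadata.
    """
    sig = inspect.signature(func)
    fields = {
        name: (
            param.annotation,
            ... if param.default is inspect.Parameter.empty else param.default,
        )
        for name, param in sig.parameters.items()
    }
    model = create_model(f"{func.__name__}_args", **fields)
    return {
        "name": func.__name__,
        "description": inspect.getdoc(func) or "",
        "inputSchema": model.model_json_schema(),
    }


def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


print(tool_schema(add))
```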
### Additional Features
- CORS Support: Ready for cross-origin requests, making it easy to integrate with web applications.
- Lifespan Management: Proper resource initialization and cleanup through FastAPI's lifespan API.
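Wiring these up in FastAPI looks roughly like the following (the wide-open CORS origins and the startup/shutdown comments are placeholders):

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Initialize shared resources (e.g. the tool registry) at startup.
    yield
    # Clean them up here when the server shuts down.


app = FastAPI(lifespan=lifespan)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # restrict this in production
    allow_methods=["*"],
    allow_headers=["*"],
)
```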
## Getting Started
### Prerequisites
- Python 3.10+
### Installation
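Clone the repository and install the dependencies. The commands below assume a standard pip-based setup with a `requirements.txt` at the repository root; adjust them to your environment:

```bash
git clone <repository-url>
cd awesome-mcp-fastapi  # directory name may differ
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
```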
### Running the Server
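Start the server with Uvicorn. The module path `app.main:app` below is an assumption; use whatever entry point the project defines:

```bash
uvicorn app.main:app --reload --port 8000
```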
Visit http://localhost:8000/docs to see the OpenAPI documentation.
## Usage
### Creating a Tool
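A tool is essentially a typed, documented function whose signature and docstring feed both the MCP tool listing and the OpenAPI docs. The sketch below is illustrative: the `app.tools` module path, the `tool_registry.register` decorator, and the `CalculatorInput` model are placeholder names, so check the project source for the actual registry API:

```python
from pydantic import BaseModel

# Hypothetical import; the real registry module and decorator name may differ.
from app.tools import tool_registry


class CalculatorInput(BaseModel):
    a: float
    b: float
    operation: str = "add"  # "add", "subtract", "multiply", or "divide"


@tool_registry.register(name="calculator")
def calculator(params: CalculatorInput) -> float:
    """Perform a basic arithmetic operation on two numbers."""
    ops = {
        "add": lambda a, b: a + b,
        "subtract": lambda a, b: a - b,
        "multiply": lambda a, b: a * b,
        "divide": lambda a, b: a / b,
    }
    return ops[params.operation](params.a, params.b)
```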
### Accessing Tools Through MCP
LLMs can discover and use your tools through the Model Context Protocol. For example, once Claude is connected to this server, you can simply ask it to add two numbers.
Claude will automatically find and use your calculator tool to perform the calculation.
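Under the hood, MCP clients list and invoke tools with JSON-RPC `tools/list` and `tools/call` messages. The sketch below shows roughly what a `tools/call` request for the calculator tool looks like; the `/mcp` endpoint path is an assumption and depends on how the server mounts its MCP transport:

```python
import httpx

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "calculator",
        "arguments": {"a": 2, "b": 3, "operation": "add"},
    },
}

# Hypothetical endpoint path; check the server's MCP transport configuration.
response = httpx.post("http://localhost:8000/mcp", json=request)
print(response.json())
```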
## Architecture
Our application follows a modular architecture, separating the FastAPI application, the tool registry, and the MCP protocol integration.
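A typical layout for such a project might look like the sketch below; the directory names are illustrative rather than a literal listing of the repository:

```text
app/
├── main.py      # FastAPI app creation, lifespan, middleware
├── api/         # HTTP routes exposed via OpenAPI
├── tools/       # tool implementations and the tool registry
└── mcp/         # MCP protocol handling (tool discovery and invocation)
```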
## Docker Support
Build and run with Docker:
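For example (the image tag is arbitrary, and a Dockerfile is assumed to be at the repository root):

```bash
docker build -t awesome-mcp-fastapi .
docker run -p 8000:8000 awesome-mcp-fastapi
```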
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
This project is licensed under the MIT License - see the LICENSE file for details.