# FastMCP - Model Context Protocol Server

FastMCP is a Model Context Protocol (MCP) server that provides LLM services through the MCP standard. It acts as a bridge between MCP clients and your local LLM service, enabling seamless integration with MCP-compatible applications.
## Features

- **MCP Protocol Compliance**: Full implementation of the Model Context Protocol
- **Tools**: Chat completion, model listing, health checks
- **Prompts**: Pre-built prompts for common tasks (assistant, code review, summarization)
- **Resources**: Server configuration and LLM service status
- **Streaming Support**: Both streaming and non-streaming responses
- **Configurable**: Environment-based configuration
- **Robust**: Built-in error handling and health monitoring
- **Integration Ready**: Works with any OpenAI-compatible LLM service
## Getting Started

### Prerequisites

- Python 3.9+
- pip
- A local LLM service running on port 5001 (OpenAI-compatible API)
- An MCP client (e.g., Claude Desktop, MCP Inspector)
### Installation

1. Clone the repository:

```bash
git clone https://github.com/yourusername/fastmcp.git
cd fastmcp
```

2. Create a virtual environment and activate it:

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

3. Install dependencies:

```bash
pip install -r requirements.txt
```

4. Create a `.env` file (copy from `.env.mcp`) and configure:

```env
# Server Settings
MCP_SERVER_NAME=fastmcp-llm-router
MCP_SERVER_VERSION=0.1.0

# LLM Service Configuration
LOCAL_LLM_SERVICE_URL=http://localhost:5001

# Optional: API Key for LLM service
# LLM_SERVICE_API_KEY=your_api_key_here

# Timeouts (in seconds)
LLM_REQUEST_TIMEOUT=60
HEALTH_CHECK_TIMEOUT=10

# Logging
LOG_LEVEL=INFO
```
### Running the MCP Server

There are several ways to start the server:

- **Option 1**: using the CLI script
- **Option 2**: direct execution
- **Option 3**: with custom configuration
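The exact invocations depend on the repository's entry points; the names below (`fastmcp` as an installed console script, `mcp_server.py` as the entry-point module) are assumptions, so substitute the actual names from the project:

```bash
# Option 1: CLI script (assumes an installed console script named "fastmcp")
fastmcp

# Option 2: direct execution (assumes the entry-point module is mcp_server.py)
python mcp_server.py

# Option 3: custom configuration via environment variables
LOCAL_LLM_SERVICE_URL=http://localhost:5001 LOG_LEVEL=DEBUG python mcp_server.py
```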
The MCP server will run on stdio and can be connected to by MCP clients.
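For orientation, here is a minimal sketch of a stdio MCP server that proxies chat requests to an OpenAI-compatible service, written with the official Python MCP SDK. It is illustrative only, not the project's actual source; the module, tool, and parameter names are assumptions.

```python
# Illustrative sketch (not the project's actual source): a stdio MCP server
# that forwards chat requests to a local OpenAI-compatible LLM service.
import os

import httpx
from mcp.server.fastmcp import FastMCP

LLM_URL = os.environ.get("LOCAL_LLM_SERVICE_URL", "http://localhost:5001")

mcp = FastMCP("fastmcp-llm-router")


@mcp.tool()
def chat_completion(prompt: str, model: str = "default") -> str:
    """Send a single-turn chat request to the local LLM service."""
    response = httpx.post(
        f"{LLM_URL}/v1/chat/completions",
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # stdio is the transport MCP clients such as Claude Desktop connect over
    mcp.run(transport="stdio")
```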
## MCP Client Integration

### Claude Desktop Integration

Add the server to your Claude Desktop configuration:
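Claude Desktop loads MCP servers from its `claude_desktop_config.json`. A plausible entry is shown below; the server key and the script path are assumptions to adapt to your checkout:

```json
{
  "mcpServers": {
    "fastmcp-llm-router": {
      "command": "python",
      "args": ["/absolute/path/to/fastmcp/mcp_server.py"],
      "env": {
        "LOCAL_LLM_SERVICE_URL": "http://localhost:5001"
      }
    }
  }
}
```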
### MCP Inspector

Test your server with MCP Inspector:
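The Inspector wraps the command that starts your server; the entry-point name below is an assumption:

```bash
npx @modelcontextprotocol/inspector python mcp_server.py
```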
## Available Tools

### 1. Chat Completion

Send messages to your LLM service:
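The argument schema is defined by the server itself; a representative call, with the parameter names (`messages`, `model`, `stream`) assumed rather than confirmed, might look like:

```json
{
  "tool": "chat_completion",
  "arguments": {
    "messages": [
      {"role": "user", "content": "Explain the MCP protocol in one sentence."}
    ],
    "model": "default",
    "stream": false
  }
}
```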
### 2. List Models

Get the available models from your LLM service:
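Assuming the tool is exposed under a name like `list_models` and takes no arguments, a call is simply:

```json
{
  "tool": "list_models",
  "arguments": {}
}
```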
### 3. Health Check

Check whether your LLM service is running:
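Under the hood, a health check typically probes the LLM service directly. You can run the same check by hand, since OpenAI-compatible services expose a models endpoint:

```bash
# Verify the local LLM service responds
curl http://localhost:5001/v1/models
```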
## Available Prompts

- `chat_assistant`: General AI assistant prompt
- `code_review`: Code review and analysis
- `summarize`: Text summarization
## Available Resources

- `config://server`: Server configuration
- `status://llm-service`: LLM service status
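For reference, resources with URIs like these can be declared with the Python MCP SDK roughly as follows; this is a sketch, not the project's actual code:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("fastmcp-llm-router")


@mcp.resource("config://server")
def server_config() -> str:
    """Return the server configuration as a JSON string."""
    return '{"name": "fastmcp-llm-router", "version": "0.1.0"}'
```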
## Project Structure
## Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.