FastMCP - Model Context Protocol Server

FastMCP is a Model Context Protocol (MCP) server that exposes a local LLM service through the MCP standard. It acts as a bridge between MCP clients and your local, OpenAI-compatible LLM service, enabling seamless integration with MCP-compatible applications.

Features

  • 🚀 MCP Protocol Compliance: Full implementation of the Model Context Protocol

  • 🔧 Tools: Chat completion, model listing, health checks

  • 📝 Prompts: Pre-built prompts for common tasks (assistant, code review, summarization)

  • 📊 Resources: Server configuration and LLM service status

  • 🔄 Streaming Support: Both streaming and non-streaming responses

  • 🔒 Configurable: Environment-based configuration

  • 🛡️ Robust: Built-in error handling and health monitoring

  • 🔌 Integration Ready: Works with any OpenAI-compatible LLM service

Getting Started

Prerequisites

  • Python 3.9+

  • pip

  • Local LLM service running on port 5001 (OpenAI-compatible API; a quick connectivity check is sketched after this list)

  • MCP client (e.g., Claude Desktop, MCP Inspector)
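
Before installing anything, you can confirm the LLM service is reachable. A minimal Python check, assuming the service exposes the standard OpenAI-compatible /v1/models route (adjust the URL if yours differs):

    # check_llm.py -- quick connectivity probe for the local LLM service.
    # Assumes an OpenAI-compatible /v1/models endpoint on port 5001.
    import requests

    resp = requests.get("http://localhost:5001/v1/models", timeout=10)
    resp.raise_for_status()
    print(resp.json())

If this prints a model list, the service is ready for FastMCP to route to.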

Installation

  1. Clone the repository:

    git clone https://github.com/yourusername/fastmcp.git
    cd fastmcp
  2. Create a virtual environment and activate it:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install dependencies:

    pip install -r requirements.txt
  4. Create a .env file (copy from .env.mcp) and configure:

    # Server Settings
    MCP_SERVER_NAME=fastmcp-llm-router
    MCP_SERVER_VERSION=0.1.0

    # LLM Service Configuration
    LOCAL_LLM_SERVICE_URL=http://localhost:5001

    # Optional: API Key for LLM service
    # LLM_SERVICE_API_KEY=your_api_key_here

    # Timeouts (in seconds)
    LLM_REQUEST_TIMEOUT=60
    HEALTH_CHECK_TIMEOUT=10

    # Logging
    LOG_LEVEL=INFO
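
These are plain environment variables, so any standard loader will pick them up. A rough, illustrative sketch of how the values above could be read in Python (the project's own config module may do this differently):

    # Illustrative only -- mirrors the variable names from .env above.
    import os

    from dotenv import load_dotenv  # pip install python-dotenv

    load_dotenv()  # reads .env from the working directory

    LOCAL_LLM_SERVICE_URL = os.getenv("LOCAL_LLM_SERVICE_URL", "http://localhost:5001")
    LLM_REQUEST_TIMEOUT = int(os.getenv("LLM_REQUEST_TIMEOUT", "60"))
    HEALTH_CHECK_TIMEOUT = int(os.getenv("HEALTH_CHECK_TIMEOUT", "10"))
    LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")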

Running the MCP Server

Option 1: Using the CLI script

python run_server.py

Option 2: Direct execution

python mcp_server.py

Option 3: With custom configuration

python run_server.py --llm-url http://localhost:5001 --log-level DEBUG

The MCP server communicates over stdio, so MCP clients connect to it by spawning the server process.
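
For a programmatic client, the official MCP Python SDK (pip install mcp) can spawn the server and open a session over stdio. A minimal sketch, with the server path and environment variable taken from this README:

    import asyncio
    import os

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(
        command="python",
        args=["mcp_server.py"],
        # Inherit the current environment and point the server at the LLM service.
        env={**os.environ, "LOCAL_LLM_SERVICE_URL": "http://localhost:5001"},
    )

    async def main() -> None:
        # Spawn the server subprocess and complete the MCP handshake.
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print("tools:", [tool.name for tool in tools.tools])

    asyncio.run(main())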

MCP Client Integration

Claude Desktop Integration

Add the server to your Claude Desktop configuration file (claude_desktop_config.json):

{ "mcpServers": { "fastmcp-llm-router": { "command": "python", "args": ["/path/to/fastmcp/mcp_server.py"], "env": { "LOCAL_LLM_SERVICE_URL": "http://localhost:5001" } } } }

MCP Inspector

Test your server with MCP Inspector:

npx @modelcontextprotocol/inspector python mcp_server.py

Available Tools

1. Chat Completion

Send messages to your LLM service:

{ "name": "chat_completion", "arguments": { "messages": [ {"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Hello!"} ], "model": "default", "temperature": 0.7 } }

2. List Models

Get available models from your LLM service:

{ "name": "list_models", "arguments": {} }

3. Health Check

Check if your LLM service is running:

{ "name": "health_check", "arguments": {} }

Available Prompts

  • chat_assistant: General AI assistant prompt

  • code_review: Code review and analysis

  • summarize: Text summarization
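
Prompts are retrieved with get_prompt. The argument names below are hypothetical; use session.list_prompts() to see the parameters each prompt actually accepts:

    from mcp import ClientSession

    async def fetch_code_review_prompt(session: ClientSession) -> None:
        # "code" is a placeholder argument name, not confirmed by this README.
        result = await session.get_prompt(
            "code_review",
            arguments={"code": "def add(a, b):\n    return a + b"},
        )
        for message in result.messages:
            print(message.role, message.content)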

Available Resources

  • config://server: Server configuration

  • status://llm-service: LLM service status
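
Resources are fetched by URI with read_resource. A minimal sketch using the two URIs listed above:

    from mcp import ClientSession
    from pydantic import AnyUrl

    async def read_server_resources(session: ClientSession) -> None:
        # URIs come straight from the resource list above.
        config = await session.read_resource(AnyUrl("config://server"))
        status = await session.read_resource(AnyUrl("status://llm-service"))
        print(config.contents, status.contents)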

Project Structure

fastmcp/
├── app/
│   ├── api/
│   │   └── v1/
│   │       └── api.py        # API routes
│   ├── core/
│   │   └── config.py         # Application configuration
│   ├── models/               # Database models
│   ├── services/             # Business logic
│   └── utils/                # Utility functions
├── tests/                    # Test files
├── .env.example              # Example environment variables
├── requirements.txt          # Project dependencies
└── README.md                 # This file

Contributing

  1. Fork the repository

  2. Create your feature branch (git checkout -b feature/amazing-feature)

  3. Commit your changes (git commit -m 'Add some amazing feature')

  4. Push to the branch (git push origin feature/amazing-feature)

  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.
