FastMCP - Model Context Protocol Server

FastMCP is a Model Context Protocol (MCP) server that provides LLM services through the MCP standard. It acts as a bridge between MCP clients and your local LLM service, enabling seamless integration with MCP-compatible applications.

Features

  • 🚀 MCP Protocol Compliance: Full implementation of the Model Context Protocol
  • 🔧 Tools: Chat completion, model listing, health checks
  • 📝 Prompts: Pre-built prompts for common tasks (assistant, code review, summarization)
  • 📊 Resources: Server configuration and LLM service status
  • 🔄 Streaming Support: Both streaming and non-streaming responses
  • 🔒 Configurable: Environment-based configuration
  • 🛡️ Robust: Built-in error handling and health monitoring
  • 🔌 Integration Ready: Works with any OpenAI-compatible LLM service

Getting Started

Prerequisites

  • Python 3.9+
  • pip
  • Local LLM service running on port 5001 (OpenAI-compatible API; a quick reachability check follows this list)
  • MCP client (e.g., Claude Desktop, MCP Inspector)
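
To confirm the LLM service is reachable before wiring anything else up, a quick standard-library check can help. This is a minimal sketch: it assumes the service exposes an OpenAI-compatible /v1/models endpoint at http://localhost:5001 (adjust the URL for your setup).

# Quick reachability check for the local LLM service.
# Standard library only; assumes an OpenAI-compatible /v1/models endpoint.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:5001/v1/models", timeout=10) as resp:
    data = json.load(resp)
print([model["id"] for model in data.get("data", [])])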

Installation

  1. Clone the repository:
    git clone https://github.com/yourusername/fastmcp.git
    cd fastmcp
  2. Create a virtual environment and activate it:
    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install dependencies:
    pip install -r requirements.txt
  4. Create a .env file (copy from .env.mcp) and configure:
    # Server Settings
    MCP_SERVER_NAME=fastmcp-llm-router
    MCP_SERVER_VERSION=0.1.0

    # LLM Service Configuration
    LOCAL_LLM_SERVICE_URL=http://localhost:5001

    # Optional: API Key for LLM service
    # LLM_SERVICE_API_KEY=your_api_key_here

    # Timeouts (in seconds)
    LLM_REQUEST_TIMEOUT=60
    HEALTH_CHECK_TIMEOUT=10

    # Logging
    LOG_LEVEL=INFO
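
For reference, here is a minimal sketch of how these settings could be read at startup. The field names mirror the .env keys above; the project's actual configuration module (app/core/config.py) may differ, and loading the .env file itself (e.g., with python-dotenv) is assumed to have happened already.

# Sketch of a settings object built from the .env keys above.
# Illustrative only; assumes the .env file has already been loaded
# into the environment (e.g., via python-dotenv).
import os
from dataclasses import dataclass
from typing import Optional

@dataclass
class Settings:
    server_name: str = os.getenv("MCP_SERVER_NAME", "fastmcp-llm-router")
    server_version: str = os.getenv("MCP_SERVER_VERSION", "0.1.0")
    llm_service_url: str = os.getenv("LOCAL_LLM_SERVICE_URL", "http://localhost:5001")
    llm_api_key: Optional[str] = os.getenv("LLM_SERVICE_API_KEY")
    llm_request_timeout: int = int(os.getenv("LLM_REQUEST_TIMEOUT", "60"))
    health_check_timeout: int = int(os.getenv("HEALTH_CHECK_TIMEOUT", "10"))
    log_level: str = os.getenv("LOG_LEVEL", "INFO")

settings = Settings()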

Running the MCP Server

Option 1: Using the CLI script
python run_server.py
Option 2: Direct execution
python mcp_server.py
Option 3: With custom configuration
python run_server.py --llm-url http://localhost:5001 --log-level DEBUG

The MCP server communicates over stdio, so MCP clients launch it as a subprocess and exchange protocol messages through standard input/output.
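
For example, a client script can launch the server and connect over stdio with the official MCP Python SDK. This is a sketch; it assumes the mcp package is installed (pip install mcp) and that it is run from the repository root.

# Minimal stdio client using the MCP Python SDK (a sketch).
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(command="python", args=["mcp_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())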

MCP Client Integration

Claude Desktop Integration

Add the server to your Claude Desktop configuration file (claude_desktop_config.json):

{ "mcpServers": { "fastmcp-llm-router": { "command": "python", "args": ["/path/to/fastmcp/mcp_server.py"], "env": { "LOCAL_LLM_SERVICE_URL": "http://localhost:5001" } } } }

MCP Inspector

Test your server with MCP Inspector:

npx @modelcontextprotocol/inspector python mcp_server.py

Available Tools

1. Chat Completion

Send messages to your LLM service:

{ "name": "chat_completion", "arguments": { "messages": [ {"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Hello!"} ], "model": "default", "temperature": 0.7 } }

2. List Models

Get available models from your LLM service:

{ "name": "list_models", "arguments": {} }

3. Health Check

Check if your LLM service is running:

{ "name": "health_check", "arguments": {} }

Available Prompts

  • chat_assistant: General AI assistant prompt
  • code_review: Code review and analysis
  • summarize: Text summarization
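
Prompts are fetched through a standard MCP get_prompt call. A minimal sketch from an initialized ClientSession follows; the "text" argument name is illustrative, so check the server's prompt definitions for the actual argument names.

# Fetching a pre-built prompt (argument names are illustrative).
from mcp import ClientSession

async def demo_prompt(session: ClientSession) -> None:
    prompt = await session.get_prompt("summarize", arguments={"text": "..."})
    for message in prompt.messages:
        print(message.role, message.content)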

Available Resources

  • config://server: Server configuration
  • status://llm-service: LLM service status
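
Both resources can be read through a standard MCP read_resource call. A sketch, reusing an initialized ClientSession (the exact shape of the returned contents depends on the SDK version):

# Reading the server's resources (a sketch).
from mcp import ClientSession
from pydantic import AnyUrl

async def demo_resources(session: ClientSession) -> None:
    for uri in ("config://server", "status://llm-service"):
        result = await session.read_resource(AnyUrl(uri))
        for item in result.contents:
            print(uri, getattr(item, "text", item))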

Project Structure

fastmcp/
├── app/
│   ├── api/
│   │   └── v1/
│   │       └── api.py      # API routes
│   ├── core/
│   │   └── config.py       # Application configuration
│   ├── models/             # Database models
│   ├── services/           # Business logic
│   └── utils/              # Utility functions
├── tests/                  # Test files
├── .env.example            # Example environment variables
├── requirements.txt        # Project dependencies
└── README.md               # This file

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

