
MCP Server - Model Context Protocol Implementation

A comprehensive Python backend implementing the Model Context Protocol (MCP) with JSON-RPC 2.0, Azure OpenAI integration, and Server-Sent Events streaming capabilities.

Features

  • Complete MCP Protocol Support: JSON-RPC 2.0 compliant implementation
  • Azure OpenAI Integration: Seamless connection to Azure OpenAI services
  • Streaming Responses: Real-time streaming via Server-Sent Events (SSE)
  • Resource Management: File system resource discovery and access
  • Tool Execution: Extensible tool registry with validation
  • Authentication: JWT-based authentication system
  • Monitoring: Prometheus metrics collection
  • Web Interface: Built-in testing and management interface

Architecture

```
├── app/
│   ├── core/
│   │   ├── config.py       # Configuration management
│   │   ├── errors.py       # Custom exception classes
│   │   └── logging.py      # Structured logging setup
│   ├── protocol/
│   │   ├── enums.py        # MCP protocol enumerations
│   │   └── models.py       # Pydantic models for MCP
│   ├── services/
│   │   ├── llm.py          # Azure OpenAI service
│   │   ├── resources.py    # Resource management
│   │   └── tools.py        # Tool registry and execution
│   ├── transport/
│   │   └── http.py         # HTTP transport layer
│   ├── auth.py             # JWT authentication
│   └── metrics.py          # Prometheus metrics
├── static/
│   └── app.js              # Frontend JavaScript
├── templates/
│   └── index.html          # Web interface
├── main.py                 # Application entry point
└── server.py               # Flask app configuration
```

Installation

  1. Clone the repository:

```
git clone <repository-url>
cd mcp-server
```

  2. Install dependencies:

```
pip install -r requirements.txt
```

  3. Set up environment variables:

```
# Required for Azure OpenAI
export OPENAI_API_KEY="your-azure-openai-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="your-deployment-name"
export AZURE_OPENAI_API_VERSION="2024-08-01-preview"

# Optional configurations
export JWT_SECRET="your-jwt-secret"
export SESSION_SECRET="your-session-secret"
```

Configuration

The server supports both Azure OpenAI and standard OpenAI configurations:

Azure OpenAI (Recommended)

```
USE_AZURE_OPENAI = True
AZURE_OPENAI_ENDPOINT = "https://your-resource.openai.azure.com"
AZURE_OPENAI_DEPLOYMENT = "gpt-4o"
AZURE_OPENAI_API_VERSION = "2024-08-01-preview"
```

Standard OpenAI

```
USE_AZURE_OPENAI = False
OPENAI_MODEL = "gpt-4o"
```

Running the Server

Development

```
python main.py
```

Production

```
gunicorn --bind 0.0.0.0:5000 --reuse-port --reload main:app
```

The server will be available at http://localhost:5000

API Endpoints

MCP Protocol

  • POST /rpc - JSON-RPC 2.0 endpoint for MCP requests (see the client sketch below)
  • GET /events - Server-Sent Events for streaming responses
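For quick experimentation, a minimal Python client for the /rpc endpoint might look like the following sketch. It assumes the server is running locally on port 5000 with the default development token documented below, and uses the requests library for brevity.

```
import requests

# Sketch of a JSON-RPC 2.0 client. The URL and token match the defaults
# documented in this README (localhost:5000, "devtoken").
RPC_URL = "http://localhost:5000/rpc"
HEADERS = {
    "Authorization": "Bearer devtoken",
    "Content-Type": "application/json",
}

def rpc_call(method: str, params: dict, request_id: str = "1") -> dict:
    """Send a JSON-RPC 2.0 request and return the parsed response body."""
    payload = {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}
    response = requests.post(RPC_URL, json=payload, headers=HEADERS, timeout=30)
    response.raise_for_status()
    return response.json()

# Example: perform the MCP initialize handshake.
print(rpc_call("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "test-client", "version": "1.0.0"},
}))
```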

Management

  • GET / - Web interface for testing and management
  • GET /health - Health check endpoint
  • GET /metrics - Prometheus metrics

Authentication

The server uses JWT-based authentication. Include the token in requests:

```
# HTTP Headers
Authorization: Bearer <token>

# Query Parameters (for SSE)
?token=<token>
```

Default development token: devtoken
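If you need to mint tokens yourself rather than using the development default, a PyJWT sketch along these lines should work, assuming the server verifies HS256 tokens signed with JWT_SECRET; the exact claims expected are defined in app/auth.py and may differ.

```
import datetime
import os

import jwt  # PyJWT

# Assumption: the server validates HS256 tokens signed with JWT_SECRET.
secret = os.environ["JWT_SECRET"]
token = jwt.encode(
    {
        "sub": "dev-user",  # hypothetical subject claim
        "exp": datetime.datetime.now(datetime.timezone.utc)
        + datetime.timedelta(hours=1),
    },
    secret,
    algorithm="HS256",
)
print(token)
```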

MCP Protocol Support

Capabilities

  • Resources: File system resource discovery and reading
  • Tools: Extensible tool execution with validation
  • Sampling: LLM completion requests (streaming and non-streaming)
  • Logging: Structured JSON logging

Example Requests

Initialize Connection
{ "jsonrpc": "2.0", "id": "init", "method": "initialize", "params": { "protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "test-client", "version": "1.0.0"} } }
List Resources
{ "jsonrpc": "2.0", "id": "resources", "method": "resources/list", "params": {} }
Execute Tool
{ "jsonrpc": "2.0", "id": "tool", "method": "tools/call", "params": { "name": "calculate", "arguments": {"operation": "add", "a": 5, "b": 3} } }
LLM Completion
{ "jsonrpc": "2.0", "id": "completion", "method": "sampling/createMessage", "params": { "messages": [{"content": {"type": "text", "text": "Hello, world!"}}], "maxTokens": 100 } }

Extending the Server

Adding New Tools

```
from app.services.tools import mcp_tool

@mcp_tool("my_tool", {
    "type": "object",
    "properties": {
        "param1": {"type": "string"},
        "param2": {"type": "number"}
    },
    "required": ["param1"]
})
async def my_custom_tool(param1: str, param2: float = 0.0):
    """Custom tool implementation"""
    return {"result": f"Processed {param1} with {param2}"}
```
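Once registered, the tool is invoked through the standard tools/call method, for example (assuming the registration above):

```
{
  "jsonrpc": "2.0",
  "id": "custom",
  "method": "tools/call",
  "params": {
    "name": "my_tool",
    "arguments": {"param1": "hello", "param2": 4.2}
  }
}
```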

Custom Resource Handlers

```
from app.services.resources import ResourceService

class CustomResourceService(ResourceService):
    async def list_resources(self, base_path: str = "."):
        # Custom resource discovery logic
        pass
```
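As an illustration, a concrete override might walk a directory tree with pathlib. The dictionary layout below is an assumption; match it to the Pydantic models in app/protocol/models.py.

```
from pathlib import Path

from app.services.resources import ResourceService

class FileResourceService(ResourceService):
    async def list_resources(self, base_path: str = "."):
        """Enumerate regular files under base_path as file:// resources."""
        resources = []
        for path in Path(base_path).rglob("*"):
            if path.is_file():
                # Hypothetical resource shape; adjust to the project's models.
                resources.append({
                    "uri": path.resolve().as_uri(),
                    "name": path.name,
                    "mimeType": "text/plain",
                })
        return resources
```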

Monitoring

The server includes comprehensive monitoring:

  • Prometheus Metrics: Request counts, response times, error rates (see the sketch after this list for adding your own)
  • Structured Logging: JSON-formatted logs with correlation IDs
  • Health Checks: Application and dependency status
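To add custom metrics alongside the built-in ones, the standard prometheus_client pattern is sketched below. The metric names already exported by app/metrics.py are not listed here, so choose names that do not collide.

```
from prometheus_client import Counter, Histogram

# Hypothetical custom metrics; register once at import time.
TOOL_CALLS = Counter(
    "mcp_tool_calls_total", "Number of tool invocations", ["tool_name"]
)
TOOL_LATENCY = Histogram(
    "mcp_tool_latency_seconds", "Tool execution latency", ["tool_name"]
)

def record_tool_call(tool_name: str, duration_seconds: float) -> None:
    """Update the custom metrics after a tool finishes executing."""
    TOOL_CALLS.labels(tool_name=tool_name).inc()
    TOOL_LATENCY.labels(tool_name=tool_name).observe(duration_seconds)
```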

Security

  • Environment-based configuration (no hardcoded secrets)
  • JWT authentication with configurable secrets
  • Input validation on all endpoints
  • Rate limiting headers from Azure OpenAI

Development

Running Tests

```
# Test the API endpoints
curl -X POST http://localhost:5000/rpc \
  -H "Authorization: Bearer devtoken" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":"test","method":"initialize","params":{}}'

# Test streaming
curl -N "http://localhost:5000/events?token=devtoken&prompt=Hello&stream=true"
```

Adding Dependencies

```
pip install <package-name>
pip freeze > requirements.txt
```

Troubleshooting

Common Issues

  1. Azure OpenAI Connection Errors
    • Verify AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_DEPLOYMENT
    • Check API key permissions
    • Ensure correct API version
  2. Authentication Failures
    • Verify JWT token format
    • Check token expiration
    • Ensure correct secret configuration
  3. Streaming Issues
    • Use query parameters for SSE authentication
    • Check network connectivity for long-running streams

Debug Logging

Enable debug logging by setting:

```
export DEBUG=true
```

License

This project is licensed under the MIT License.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Submit a pull request

Support

For issues and questions:

  • Check the troubleshooting section
  • Review the API documentation
  • Open an issue on GitHub