# MCP Server - Model Context Protocol Implementation

A comprehensive Python backend implementing the Model Context Protocol (MCP) with JSON-RPC 2.0, Azure OpenAI integration, and Server-Sent Events (SSE) streaming.

- Uses Flask as the web framework to serve the MCP protocol endpoints, the web interface, and SSE streaming.
- Supports production deployment with Gunicorn, with options for binding, port reuse, and hot reloading.
- Integrates with Azure OpenAI for LLM completions, supporting both streaming and non-streaming responses with configurable models and deployments.
- Collects metrics on request counts, response times, and error rates, exposed through a dedicated `/metrics` endpoint.
- Uses Pydantic models for MCP protocol validation and type safety throughout the application.
## Features

- **Complete MCP Protocol Support**: JSON-RPC 2.0 compliant implementation
- **Azure OpenAI Integration**: Seamless connection to Azure OpenAI services
- **Streaming Responses**: Real-time streaming via Server-Sent Events (SSE)
- **Resource Management**: File system resource discovery and access
- **Tool Execution**: Extensible tool registry with validation
- **Authentication**: JWT-based authentication system
- **Monitoring**: Prometheus metrics collection
- **Web Interface**: Built-in testing and management interface
## Architecture
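At a high level, the pieces fit together roughly as follows (a sketch based on the features listed above, not an exact module layout):

```
Client (JSON-RPC / SSE)
        │
        ▼
Flask app (JWT auth, Pydantic validation)
        ├── POST /rpc ────► MCP handlers
        │                     ├── Resources: file system discovery/reading
        │                     ├── Tools: extensible registry with validation
        │                     └── Sampling ──► Azure OpenAI
        ├── GET /events ──► Server-Sent Events streaming
        ├── GET /       ──► web interface
        ├── GET /health ──► health checks
        └── GET /metrics ─► Prometheus metrics
```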
## Installation

1. Clone the repository
2. Install dependencies
3. Set up environment variables

All three steps are sketched in the commands below.
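A minimal sequence, assuming a standard Python layout with a `requirements.txt` and an `.env.example` template (the repository URL is a placeholder):

```bash
# 1. Clone the repository (placeholder URL)
git clone <repository-url>
cd mcp-server

# 2. Install dependencies (assumes a requirements.txt)
pip install -r requirements.txt

# 3. Set up environment variables (assumes an .env.example template)
cp .env.example .env
```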
## Configuration

The server supports both Azure OpenAI and standard OpenAI configurations.

### Azure OpenAI (Recommended)
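Example `.env` values; `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_DEPLOYMENT` are the names referenced in Troubleshooting below, while the key and version variable names are assumptions:

```bash
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT=<your-deployment-name>
AZURE_OPENAI_API_KEY=<your-api-key>       # variable name assumed
AZURE_OPENAI_API_VERSION=2024-02-01       # variable name and value assumed
```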
### Standard OpenAI
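For standard OpenAI, the variable names below are assumptions for a typical setup:

```bash
OPENAI_API_KEY=<your-api-key>   # variable name assumed
OPENAI_MODEL=gpt-4o             # variable name and model assumed
```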
## Running the Server

### Development
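For local development, assuming the Flask entry point is `main.py` (the module name is an assumption):

```bash
python main.py
```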
### Production
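For production, a Gunicorn invocation using the binding, port-reuse, and hot-reload options mentioned above (`main:app` is an assumed module path):

```bash
gunicorn --bind 0.0.0.0:5000 --reuse-port --reload main:app
```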
The server will be available at `http://localhost:5000`.
## API Endpoints

### MCP Protocol

- `POST /rpc` - JSON-RPC 2.0 endpoint for MCP requests
- `GET /events` - Server-Sent Events for streaming responses

### Management

- `GET /` - Web interface for testing and management
- `GET /health` - Health check endpoint
- `GET /metrics` - Prometheus metrics
## Authentication

The server uses JWT-based authentication. Include the token in requests; the default development token is `devtoken`.
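For example (the SSE `token` query parameter name is an assumption; see Streaming Issues under Troubleshooting):

```bash
# JSON-RPC request with a bearer token (ping shown as a minimal method)
curl -X POST http://localhost:5000/rpc \
  -H "Authorization: Bearer devtoken" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "ping"}'

# SSE streams authenticate via a query parameter (parameter name assumed)
curl -N "http://localhost:5000/events?token=devtoken"
```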
## MCP Protocol Support

### Capabilities

- **Resources**: File system resource discovery and reading
- **Tools**: Extensible tool execution with validation
- **Sampling**: LLM completion requests (streaming and non-streaming)
- **Logging**: Structured JSON logging
### Example Requests

#### Initialize Connection
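A sketch of an `initialize` request using the standard MCP JSON-RPC shape (all values illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```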
#### List Resources
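Listing the file system resources the server exposes:

```json
{ "jsonrpc": "2.0", "id": 2, "method": "resources/list" }
```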
#### Execute Tool
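Calling a tool via the standard `tools/call` method (the `echo` tool and its arguments are hypothetical):

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "echo",
    "arguments": { "text": "hello" }
  }
}
```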
#### LLM Completion
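Requesting an LLM completion through MCP sampling; the method name follows the MCP spec, though this server's exact request shape may differ:

```json
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "sampling/createMessage",
  "params": {
    "messages": [
      { "role": "user", "content": { "type": "text", "text": "Hello!" } }
    ],
    "maxTokens": 256
  }
}
```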
## Extending the Server

### Adding New Tools
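A sketch of registering a tool; the `register_tool` decorator and module path are hypothetical, but the Pydantic validation mirrors how the server validates protocol input:

```python
from pydantic import BaseModel

# Hypothetical import; adapt to the actual tool registry module.
from mcp_server.tools import register_tool


class EchoArgs(BaseModel):
    """Arguments are validated with Pydantic before the tool runs."""
    text: str


@register_tool(name="echo", description="Echo the input text", args_model=EchoArgs)
def echo(args: EchoArgs) -> str:
    # The return value becomes the tool result sent back to the client.
    return args.text
```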
### Custom Resource Handlers
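A sketch of a custom resource handler; the `register_resource_handler` hook is hypothetical:

```python
# Hypothetical import; adapt to the actual resource module.
from mcp_server.resources import register_resource_handler


@register_resource_handler(scheme="notes")
def read_note(uri: str) -> str:
    """Resolve a custom notes:// URI to text content."""
    name = uri.removeprefix("notes://")
    with open(f"notes/{name}.txt", encoding="utf-8") as f:
        return f.read()
```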
## Monitoring

The server includes comprehensive monitoring:

- **Prometheus Metrics**: Request counts, response times, error rates
- **Structured Logging**: JSON-formatted logs with correlation IDs
- **Health Checks**: Application and dependency status
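To spot-check the monitoring endpoints locally:

```bash
curl http://localhost:5000/health    # application and dependency status
curl http://localhost:5000/metrics   # Prometheus exposition format
```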
## Security

- Environment-based configuration (no hardcoded secrets)
- JWT authentication with configurable secrets
- Input validation on all endpoints
- Rate limiting headers from Azure OpenAI
## Development

### Running Tests
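Assuming a pytest-based suite (the test runner is an assumption):

```bash
pytest
```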
### Adding Dependencies
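Assuming dependencies are pinned in `requirements.txt`:

```bash
pip install <package>
pip freeze > requirements.txt
```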
## Troubleshooting

### Common Issues

**Azure OpenAI Connection Errors**

- Verify `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_DEPLOYMENT`
- Check API key permissions
- Ensure correct API version
**Authentication Failures**

- Verify JWT token format
- Check token expiration
- Ensure correct secret configuration
**Streaming Issues**

- Use query parameters for SSE authentication
- Check network connectivity for long-running streams
### Debug Logging

Enable debug logging by setting:
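The variable name below is an assumption; check the server's logging configuration:

```bash
export LOG_LEVEL=DEBUG   # variable name assumed
```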
## License

This project is licensed under the MIT License.
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Submit a pull request
## Support

For issues and questions:

- Check the troubleshooting section
- Review the API documentation
- Open an issue on GitHub