Model Context Protocol (MCP) Server
A FastAPI-based server implementing the Model Context Protocol to provide contextual information to AI models. It acts as middleware between AI models and various data sources, intelligently routing each query to the most appropriate data provider.
Features
- Intelligent query routing based on query analysis
- Support for multiple data sources (Database, GraphQL, REST)
- Integration with Ollama models (Mistral, Qwen, Llama2)
- Environment-aware configuration (Development/Production)
- Comprehensive logging and error handling
- Health check endpoints
- Mock data support for development
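The intelligent query routing listed above can be sketched as a simple keyword-based dispatcher. The names here (`QueryRouter`, `ProviderType`, the keyword sets) are illustrative assumptions, not the server's actual implementation:

```python
# Illustrative sketch of keyword-based query routing; names and logic
# are assumptions, not this server's actual implementation.
from enum import Enum


class ProviderType(Enum):
    DATABASE = "database"
    GRAPHQL = "graphql"
    REST = "rest"


class QueryRouter:
    """Routes a free-text query to the most likely data provider."""

    # Keywords that hint at each provider; a real router might instead
    # ask an LLM to classify the query.
    KEYWORD_MAP = {
        ProviderType.DATABASE: {"table", "sql", "record", "row"},
        ProviderType.GRAPHQL: {"graphql", "schema", "mutation"},
        ProviderType.REST: {"endpoint", "api", "http"},
    }

    def route(self, query: str) -> ProviderType:
        words = set(query.lower().split())
        # Pick the provider with the most keyword overlap.
        best = max(
            self.KEYWORD_MAP,
            key=lambda p: len(self.KEYWORD_MAP[p] & words),
        )
        # Fall back to REST when nothing matches.
        if not self.KEYWORD_MAP[best] & words:
            return ProviderType.REST
        return best
```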
Prerequisites
- Python 3.8+
- Ollama installed and running locally
- Required Ollama models:
  - mistral
  - qwen
  - llama2
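Assuming Ollama is already installed, the required models can be pulled with its CLI:

```shell
# Pull the models the server expects (requires a local Ollama install)
ollama pull mistral
ollama pull qwen
ollama pull llama2
```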
Installation
- Clone the repository:
- Create and activate a virtual environment:
- Install dependencies:
- Create a `.env` file:
- Update the `.env` file with your configuration:
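On a Unix shell, the steps above might look like the following. The repository URL is a placeholder, and the sample `.env` keys are assumptions about this server's configuration, not its documented settings:

```shell
# 1. Clone the repository (URL is a placeholder)
git clone <repository-url>
cd <repository-directory>

# 2. Create and activate a virtual environment
python -m venv venv
source venv/bin/activate

# 3. Install dependencies
pip install -r requirements.txt

# 4. Create a .env file (keys shown are illustrative assumptions)
cat > .env <<'EOF'
ENVIRONMENT=development
OLLAMA_BASE_URL=http://localhost:11434
LOG_LEVEL=INFO
EOF
```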
Running the Server
- Start Ollama (if not already running):
- Start the MCP server:
The server will be available at http://localhost:8000
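Assuming the FastAPI app lives in `main.py` (an assumption about this project's layout), the two steps might be:

```shell
# Start Ollama in one terminal (if not already running)
ollama serve

# Start the MCP server in another terminal
uvicorn main:app --host 0.0.0.0 --port 8000
```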
API Endpoints
Get Context
List Available Models
Health Check
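As a rough illustration, the endpoints might be exercised with `curl`. The exact paths (`/context`, `/models`, `/health`) and request shape are assumptions, so check the route definitions in the code:

```shell
# Get context for a query (path and payload shape are assumptions)
curl -X POST http://localhost:8000/context \
  -H "Content-Type: application/json" \
  -d '{"query": "list recent orders", "model": "mistral"}'

# List available models
curl http://localhost:8000/models

# Health check
curl http://localhost:8000/health
```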
Project Structure
Development
Adding New Providers
- Create a new provider class in the appropriate directory
- Implement the required interface methods
- Register the provider in the factory
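The three steps above can be sketched as follows. The interface (`BaseProvider`), method names, and factory shown here are hypothetical; mirror the actual base class and factory in this repository instead:

```python
# Hypothetical sketch of the provider pattern described above; the real
# interface and factory in this repository may differ.
from abc import ABC, abstractmethod


class BaseProvider(ABC):
    """Interface every data provider must implement (assumed shape)."""

    @abstractmethod
    def get_context(self, query: str) -> dict:
        """Return contextual data for the given query."""


class ProviderFactory:
    """Registry mapping provider names to provider classes."""

    _registry: dict = {}

    @classmethod
    def register(cls, name: str, provider_cls) -> None:
        cls._registry[name] = provider_cls

    @classmethod
    def create(cls, name: str) -> BaseProvider:
        return cls._registry[name]()


# Steps 1 and 2: a new provider class implementing the interface
class CsvProvider(BaseProvider):
    def get_context(self, query: str) -> dict:
        # A real provider would read and filter CSV data here.
        return {"source": "csv", "query": query, "rows": []}


# Step 3: register the provider in the factory
ProviderFactory.register("csv", CsvProvider)
```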
Adding New Models
- Add the model to the `AVAILABLE_MODELS` dictionary in `model_providers/ollama.py`
- Update the model validation logic if needed
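A minimal sketch of what the dictionary and validation might look like; the actual structure in `model_providers/ollama.py` may differ, and the `phi3` entry is a hypothetical new model:

```python
# Illustrative sketch; the real AVAILABLE_MODELS structure may differ.
AVAILABLE_MODELS = {
    "mistral": {"description": "Mistral 7B general-purpose model"},
    "qwen": {"description": "Qwen general-purpose model"},
    "llama2": {"description": "Llama 2 general-purpose model"},
    # Hypothetical new entry added per the steps above:
    "phi3": {"description": "Phi-3 small model"},
}


def validate_model(name: str) -> str:
    """Reject model names the server does not know about."""
    if name not in AVAILABLE_MODELS:
        raise ValueError(f"Unknown model: {name}")
    return name
```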
Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.