Model Context Protocol (MCP) Server

A FastAPI-based server implementing the Model Context Protocol to provide contextual information to AI models. The server acts as middleware between AI models and various data sources, intelligently routing each query to the most appropriate data provider.
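
The routing idea, as a rough sketch only (the class and the keyword heuristics below are illustrative assumptions, not the actual query_analyzer.py logic):

from enum import Enum

class SourceType(Enum):
    DATABASE = "database"
    GRAPHQL = "graphql"
    REST = "rest"

def route_query(query: str) -> SourceType:
    """Pick a data source with simple keyword heuristics (illustrative only)."""
    q = query.lower()
    if any(word in q for word in ("order", "inventory", "stock")):
        return SourceType.DATABASE
    if any(word in q for word in ("product", "catalog", "spec")):
        return SourceType.GRAPHQL
    return SourceType.REST  # fall back to the REST provider

print(route_query("Tell me about iPhone 15"))  # SourceType.REST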

Features

  • Intelligent query routing based on query analysis
  • Support for multiple data sources (Database, GraphQL, REST)
  • Integration with Ollama models (Mistral, Qwen, Llama2)
  • Environment-aware configuration (Development/Production)
  • Comprehensive logging and error handling
  • Health check endpoints
  • Mock data support for development

Prerequisites

  • Python 3.8+
  • Ollama installed and running locally
  • Required Ollama models:
    • mistral
    • qwen
    • llama2

Installation

  1. Clone the repository:

git clone <repository-url>
cd mcp-server

  2. Create and activate a virtual environment:

python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

  3. Install dependencies:

pip install -r requirements.txt

  4. Create a .env file:

cp .env.example .env

  5. Update the .env file with your configuration:

ENVIRONMENT=development
OLLAMA_MODEL=mistral
OLLAMA_BASE_URL=http://localhost:11434
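
These values are read from the environment at startup. A minimal sketch of environment-aware loading, assuming plain os.getenv with the defaults above (the actual mechanism in main.py may differ, e.g. it may use python-dotenv):

import os

# Minimal sketch of reading the .env values; illustrative assumption,
# not necessarily how main.py loads configuration.
ENVIRONMENT = os.getenv("ENVIRONMENT", "development")
OLLAMA_MODEL = os.getenv("OLLAMA_MODEL", "mistral")
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")

# Mock data support for development (per the feature list above).
USE_MOCK_DATA = ENVIRONMENT == "development"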

Running the Server

  1. Start Ollama (if not already running):

ollama serve

  2. Start the MCP server:

python main.py

The server will be available at http://localhost:8000.

API Endpoints

Get Context

curl -X POST http://localhost:8000/context \
  -H "Content-Type: application/json" \
  -d '{
    "query": "Tell me about iPhone 15",
    "model": "mistral"
  }'
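
The same call from Python, as a minimal sketch using the third-party requests library (the response schema is not documented here, so this just prints the raw JSON):

import requests

# Equivalent of the curl request above.
response = requests.post(
    "http://localhost:8000/context",
    json={"query": "Tell me about iPhone 15", "model": "mistral"},
    timeout=30,
)
response.raise_for_status()
print(response.json())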

List Available Models

curl http://localhost:8000/models

Health Check

curl http://localhost:8000/health

Project Structure

mcp-server/
├── context_providers/        # Data source providers
│   ├── database.py           # Database provider
│   ├── graphql.py            # GraphQL provider
│   ├── rest.py               # REST API provider
│   └── provider_factory.py
├── model_providers/          # AI model providers
│   ├── base.py               # Base model provider
│   ├── ollama.py             # Ollama integration
│   └── provider_factory.py
├── main.py                   # FastAPI application
├── query_analyzer.py         # Query analysis logic
├── logger_config.py          # Logging configuration
├── requirements.txt          # Project dependencies
└── README.md                 # Project documentation

Development

Adding New Providers

  1. Create a new provider class in the appropriate directory
  2. Implement the required interface methods
  3. Register the provider in the factory
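
A hedged sketch of what a new context provider might look like — the method name and signature are assumptions about the interface; check the modules in context_providers/ for the real contract:

from typing import Any, Dict

# Hypothetical new provider; the interface shown here is an assumption.
class SearchProvider:
    """Hypothetical provider backed by an external search API."""

    async def get_context(self, query: str) -> Dict[str, Any]:
        # Fetch and return contextual data for the query.
        return {"source": "search", "query": query, "results": []}

# Registration would then go in context_providers/provider_factory.py,
# e.g. mapping a source-type key such as "search" to SearchProvider.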

Adding New Models

  1. Add the model to the AVAILABLE_MODELS dictionary in model_providers/ollama.py
  2. Update the model validation logic if needed
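
For example, assuming AVAILABLE_MODELS simply maps model names to Ollama tags (the real structure in model_providers/ollama.py may carry richer metadata per model):

# Illustrative only; not the actual dictionary from the repository.
AVAILABLE_MODELS = {
    "mistral": "mistral",
    "qwen": "qwen",
    "llama2": "llama2",
    "codellama": "codellama",  # hypothetical new entry
}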

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.
