MCP Server - Model Context Protocol API
MCP Server is a FastAPI-based implementation of the Model Context Protocol (MCP) that provides a standardized interface for interaction between LLMs and applications.
Features
- 🚀 High-performance API based on FastAPI and asynchronous operations
- 🔄 Full MCP support with resources, tools, prompts and sampling
- 📊 Monitoring and metrics via Prometheus and Grafana
- 🧩 Extensibility through simple interfaces to add new tools
- 📝 GraphQL API for flexible data access
- 💬 WebSocket support for real-time interaction
- 🔍 Semantic search via integration with Elasticsearch
- 🗃️ Caching via Redis for improved performance
- 📦 Dependency management via Poetry for reliable packaging
Getting Started
Installation
- Clone the repository
- Install Poetry (if not already installed)
- Install the dependencies via Poetry (see the commands below)
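A sketch of these steps; the repository URL is a placeholder, and the Poetry installer line is the standard one from python-poetry.org:

```bash
# Clone the repository (URL is a placeholder)
git clone https://github.com/<owner>/mcp-server.git
cd mcp-server

# Install Poetry using the official installer
curl -sSL https://install.python-poetry.org | python3 -

# Install project dependencies
poetry install
```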
Starting the server
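Run the FastAPI app with Uvicorn through Poetry; the module path below is an assumption, so adjust it to the actual package layout:

```bash
# Module path mcp_server.main:app is an assumption; adjust to the project layout
poetry run uvicorn mcp_server.main:app --host 0.0.0.0 --port 8000 --reload
```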
Or via the just utility:
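The recipe name is not given in this README, so `run` here is an assumption:

```bash
# Recipe name is an assumption; check the justfile
just run
```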
After launch, the API is available at: http://localhost:8000
API Documentation
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- GraphQL Playground: http://localhost:8000/graphql
Project structure
Available tools
File System Tool
A file system tool that supports reading, writing, deleting and listing files.
Weather Tool
A tool for obtaining weather data by coordinates.
Text Analysis Tool
A tool for text analysis, including sentiment detection and summarization.
Text Processor Tool
A tool for text processing, including formatting, statistics calculation and entity extraction.
Image Processing Tool
An image processing tool that supports resizing, cropping and applying filters.
WebSocket API
To connect to the WebSocket API:
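The original connection example is not preserved; below is a minimal client sketch using the `websockets` package, where the `/ws` path and the message shape are assumptions:

```python
import asyncio
import json

import websockets  # requires the 'websockets' package


async def main() -> None:
    # The /ws route is an assumption; use the server's actual WebSocket path
    async with websockets.connect("ws://localhost:8000/ws") as ws:
        # Hypothetical request payload
        await ws.send(json.dumps({"type": "ping"}))
        reply = await ws.recv()
        print(reply)


asyncio.run(main())
```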
GraphQL API
Example queries via the GraphQL API:
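The original queries are not preserved here; as an illustration, the snippet below posts a query to the /graphql endpoint with `httpx` (or swap in `requests`). The `tools` field and its shape are assumptions about the schema:

```python
import httpx

# Hypothetical query; field names depend on the actual GraphQL schema
query = """
query {
  tools {
    name
    description
  }
}
"""

response = httpx.post("http://localhost:8000/graphql", json={"query": query})
response.raise_for_status()
print(response.json())
```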
Running tests
To run tests, use Poetry:
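Assuming pytest is the test runner (a typical choice for FastAPI projects, though not stated here):

```bash
poetry run pytest
```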
Or via the just utility:
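The recipe name is an assumption; check the justfile for the actual target:

```bash
# Recipe name is an assumption
just test
```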
Docker
Building and running via Docker Compose
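Assuming a docker-compose.yml at the repository root, a typical invocation is:

```bash
docker compose up --build -d
```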
To launch individual services:
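The service names below (redis, elasticsearch, mcp-server) are assumptions based on the stack described above:

```bash
# Service names are assumptions; check docker-compose.yml
docker compose up -d redis elasticsearch
docker compose up -d mcp-server
```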
Integration with LLMs
MCP Server provides a standardized interface for integration with LLMs from various vendors:
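The original example is not preserved; as a hedged illustration, the sketch below wraps one of the server's tools in a plain Python function that an LLM client (OpenAI, Anthropic, etc.) could register as a callable tool. The /tools/weather path and its parameter names are assumptions:

```python
import httpx

BASE_URL = "http://localhost:8000"


def get_weather(latitude: float, longitude: float) -> dict:
    """Call the server's weather tool (endpoint path and parameters are assumptions)."""
    response = httpx.post(
        f"{BASE_URL}/tools/weather",
        json={"latitude": latitude, "longitude": longitude},
    )
    response.raise_for_status()
    return response.json()


# An LLM framework can expose get_weather as a tool and route
# the model's tool calls to this function.
print(get_weather(52.52, 13.40))
```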
Metrics and monitoring
MCP Server provides metrics in Prometheus format via the /metrics endpoint. Metrics include:
- Number of requests to each tool
- Query execution time
- Errors and exceptions
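The endpoint can be scraped directly, for example:

```bash
curl http://localhost:8000/metrics
```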
Development
To format code and check it with linters:
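The specific formatters and linters are not named in this README; a common setup for a Poetry project (black, ruff and mypy are assumptions) would be:

```bash
# Tool choices are assumptions; check pyproject.toml for the actual dev dependencies
poetry run black .
poetry run ruff check .
poetry run mypy .
```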
License