Remote-capable server
The server can be hosted and run remotely because it primarily relies on remote services or has no dependency on the local environment.
Integrations
- Enables vector search capabilities for AI queries, allowing efficient similarity searches and semantic retrieval of data stored in Elasticsearch indices.
- Powers the REST, GraphQL, and WebSocket API interfaces, enabling different methods of interacting with the AI models through standardized endpoints.
- Visualizes AI system metrics and performance data, providing dashboards for monitoring model behavior and operational health.
MCP Server - Model Context Protocol API
MCP Server is a FastAPI-based implementation of the Model Context Protocol (MCP) that provides a standardized interface for interaction between LLMs and applications.
Features
- 🚀 High-performance API based on FastAPI and asynchronous operations
- 🔄 Full MCP support, including resources, tools, prompts, and sampling
- 📊 Monitoring and metrics via Prometheus and Grafana
- 🧩 Extensibility through simple interfaces to add new tools
- 📝 GraphQL API for flexible data queries
- 💬 WebSocket support for real-time interaction
- 🔍 Semantic search via integration with Elasticsearch
- 🗃️ Caching via Redis for improved performance
- 📦 Dependency management via Poetry for reliable packaging
Getting Started
Installation
- Clone the repository
- Install Poetry (if not already installed)
- Install dependencies via Poetry
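A minimal sketch of these three steps, assuming a standard Poetry setup (the repository URL below is a placeholder):

```bash
# Clone the repository (placeholder URL: substitute the actual repository)
git clone <repository-url> mcp-server
cd mcp-server

# Install Poetry using the official installer
curl -sSL https://install.python-poetry.org | python3 -

# Install project dependencies
poetry install
```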
Starting the server
The server can be started through Poetry or via the just utility, as sketched below.
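A minimal sketch of both options, assuming a uvicorn entry point at main:app and a run recipe in the justfile (both are assumptions about the project layout):

```bash
# Start via Poetry (the main:app entry point is an assumption)
poetry run uvicorn main:app --host 0.0.0.0 --port 8000 --reload

# Or via the just utility (recipe name is an assumption)
just run
```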
After launch, the API is available at: http://localhost:8000
API Documentation
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- GraphQL Playground: http://localhost:8000/graphql
Project structure
Available tools
File System Tool
A file system tool that supports reading, writing, deleting and listing files.
Weather Tool
A tool for obtaining weather data by coordinates.
Text Analysis Tool
A tool for text analysis, including sentiment detection and summarization.
Text Processor Tool
A tool for text processing, including formatting, statistics calculation, and entity extraction.
Image Processing Tool
An image processing tool that supports resizing, cropping and applying filters.
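As an illustration only, a tool such as the weather tool might be invoked over the REST API roughly as follows; the route and parameter names are assumptions rather than the server's documented contract, so check the Swagger UI at /docs for the actual endpoints:

```python
import httpx

# Hypothetical route and payload: see /docs for the real endpoints and schemas.
response = httpx.post(
    "http://localhost:8000/api/v1/tools/weather",
    json={"latitude": 52.52, "longitude": 13.41},
)
response.raise_for_status()
print(response.json())
```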
WebSocket API
To connect to the WebSocket API:
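A minimal client sketch using the websockets library; the /ws path and the JSON message shape are assumptions, not the documented protocol:

```python
import asyncio
import json

import websockets

async def main() -> None:
    # The /ws path is an assumption; use the route the server actually defines.
    async with websockets.connect("ws://localhost:8000/ws") as ws:
        # Message format is an assumption as well.
        await ws.send(json.dumps({"type": "ping"}))
        reply = await ws.recv()
        print(reply)

asyncio.run(main())
```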
GraphQL API
Examples of queries via GraphQL:
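A sketch of a query sent to the /graphql endpoint with httpx; the field names are assumptions about the schema, and the GraphQL Playground shows the actual types:

```python
import httpx

# Field names below are assumptions; inspect the GraphQL Playground at /graphql
# for the queries and types the server actually exposes.
query = """
query {
  tools {
    name
    description
  }
}
"""

response = httpx.post("http://localhost:8000/graphql", json={"query": query})
response.raise_for_status()
print(response.json())
```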
Running tests
Tests can be run through Poetry or via the just utility, as sketched below.
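A sketch of both invocations, assuming pytest as the test runner and a test recipe in the justfile:

```bash
# Run the test suite through Poetry (assumes pytest)
poetry run pytest

# Or via the just utility (recipe name is an assumption)
just test
```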
Docker
Building and running via Docker Compose
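A sketch of the build-and-run command, assuming the Compose file sits at the repository root:

```bash
# Build images and start the full stack in the background
docker compose up -d --build
```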
To launch individual services:
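Individual services can be started by name; the service names below are assumptions about the Compose file:

```bash
# Start only selected services (names are assumptions)
docker compose up -d redis elasticsearch
```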
Integration with LLM
MCP Server provides a standardized interface for integration with LLM models from various vendors.
Metrics and monitoring
MCP Server provides metrics in Prometheus format via the /metrics endpoint. Metrics include:
- Number of requests to each tool
- Query execution time
- Errors and exceptions
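Once the server is running, the raw metrics can be inspected with curl:

```bash
# Fetch Prometheus-format metrics from the running server
curl http://localhost:8000/metrics
```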
Development
To format code and check it with linters:
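A sketch of typical invocations; the tool choices and recipe name are assumptions, so check pyproject.toml and the justfile for the actual commands:

```bash
# Formatting and lint checks (tool choices are assumptions)
poetry run black .
poetry run ruff check .

# Or via the just utility (recipe name is an assumption)
just lint
```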
License