MyAIServ MCP Server

by eagurin

remote-capable server

The server can be hosted and run remotely because it primarily relies on remote services or has no dependency on the local environment.

Integrations

  • Elasticsearch — enables vector search capabilities for AI queries, allowing efficient similarity searches and semantic retrieval of data stored in Elasticsearch indices.

  • FastAPI — powers the REST, GraphQL, and WebSocket API interfaces, enabling different methods of interacting with the AI models through standardized endpoints.

  • Grafana — visualizes AI system metrics and performance data, providing dashboards for monitoring model behavior and operational health.

MCP Server - Model Context Protocol API

MCP Server is a FastAPI-based implementation of the Model Context Protocol (MCP) that provides a standardized interface for interaction between LLMs and applications.

Features

  • 🚀 High-performance API built on FastAPI and asynchronous operations
  • 🔄 Full MCP support: resources, tools, prompts, and sampling
  • 📊 Monitoring and metrics via Prometheus and Grafana
  • 🧩 Extensible: new tools can be added through simple interfaces
  • 📝 GraphQL API for flexible data access
  • 💬 WebSocket support for real-time interaction
  • 🔍 Semantic search via Elasticsearch integration
  • 🗃️ Redis caching for improved performance
  • 📦 Dependency management via Poetry for reliable packaging

Getting Started

Installation

  1. Clone repository:
    git clone https://github.com/yourusername/myaiserv.git
    cd myaiserv
  2. Install Poetry (if not already installed):
    curl -sSL https://install.python-poetry.org | python3 -
  3. Install dependencies via Poetry:
    poetry install

Starting the server

poetry run uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload

Or via the just utility:

just run

After launch, the API is available at: http://localhost:8000

API Documentation

Once the server is running, FastAPI serves interactive API documentation at http://localhost:8000/docs (Swagger UI) and http://localhost:8000/redoc (ReDoc).

Project structure

myaiserv/
├── app/
│   ├── core/                  # Core MCP components
│   │   ├── base_mcp.py        # Abstract MCP classes
│   │   └── base_sampling.py   # Base classes for sampling
│   ├── models/                # Pydantic models
│   │   ├── mcp.py             # MCP data models
│   │   └── graphql.py         # GraphQL schema
│   ├── services/              # Business logic
│   │   └── mcp_service.py     # MCP service
│   ├── storage/               # Data storage
│   ├── tools/                 # MCP tools
│   │   ├── example_tool.py    # Example tools
│   │   └── text_processor.py  # Text processing tool
│   ├── utils/                 # Utilities
│   └── main.py                # FastAPI entry point
├── app/tests/                 # Tests
├── docs/                      # Documentation
│   └── MCP_API.md             # API description
├── pyproject.toml             # Poetry and tool configuration
└── .justfile                  # Tasks for the just utility

Available tools

File System Tool

A file system tool that supports reading, writing, deleting and listing files.

curl -X POST "http://localhost:8000/tools/file_operations" \
  -H "Content-Type: application/json" \
  -d '{"operation": "list", "path": "."}'

Weather Tool

A tool for obtaining weather data by coordinates.

curl -X POST "http://localhost:8000/tools/weather" \
  -H "Content-Type: application/json" \
  -d '{"latitude": 37.7749, "longitude": -122.4194}'

Text Analysis Tool

A tool for text analysis, including sentiment detection and summarization.

curl -X POST "http://localhost:8000/tools/text_analysis" \
  -H "Content-Type: application/json" \
  -d '{"text": "Example text for analysis", "analysis_type": "sentiment"}'

Text Processor Tool

A tool for text processing, including formatting, statistics calculation, entity extraction.

curl -X POST "http://localhost:8000/tools/text_processor" \
  -H "Content-Type: application/json" \
  -d '{"operation": "statistics", "text": "Example text", "stat_options": ["chars", "words"]}'
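For illustration, the statistics operation might compute something like the following local sketch. The actual implementation lives in app/tools/text_processor.py; the function name and the exact set of supported stat_options here are assumptions:

```python
def text_statistics(text: str, stat_options: list[str]) -> dict[str, int]:
    """Sketch of what the statistics operation might return for the
    requested stat_options (hypothetical; not the project's real code)."""
    available = {
        "chars": len(text),
        "words": len(text.split()),
        "lines": len(text.splitlines() or [""]),
    }
    # Return only the statistics the caller asked for, ignoring unknown keys
    return {key: available[key] for key in stat_options if key in available}
```

For the request above, `text_statistics("Example text", ["chars", "words"])` would yield a chars and a words count for the input.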

Image Processing Tool

An image processing tool that supports resizing, cropping and applying filters.

curl -X POST "http://localhost:8000/tools/image_processing" \
  -H "Content-Type: application/json" \
  -d '{"operation": "resize", "image_data": "base64...", "params": {"width": 800, "height": 600}}'

WebSocket API

To connect to the WebSocket API:

const socket = new WebSocket("ws://localhost:8000/ws");

socket.onopen = () => {
  socket.send(JSON.stringify({ type: "initialize", id: "my-request-id" }));
};

socket.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log("Received:", data);
};

GraphQL API

Examples of queries via GraphQL:

# Get the list of all tools
query {
  getTools {
    name
    description
  }
}

# Execute a tool
mutation {
  executeTool(input: {
    name: "text_processor",
    parameters: {
      operation: "statistics",
      text: "Example text for analysis"
    }
  }) {
    content {
      type
      text
    }
    is_error
  }
}

Running tests

To run tests, use Poetry:

poetry run pytest

Or via the just utility:

just test

Docker

Building and running via Docker Compose

docker compose up -d

To launch individual services:

docker compose up -d web redis elasticsearch

Integration with LLM

MCP Server provides a standardized interface for integrating with LLMs from various vendors:

import httpx


async def query_mcp_with_llm(prompt: str):
    async with httpx.AsyncClient() as client:
        # Query MCP for the context and the available tools
        tools_response = await client.get("http://localhost:8000/tools")
        tools = tools_response.json()["tools"]

        # Send the request to the LLM, including the MCP context
        llm_response = await client.post(
            "https://api.example-llm.com/v1/chat",
            json={
                "messages": [
                    {"role": "system", "content": "You have access to the following tools:"},
                    {"role": "user", "content": prompt},
                ],
                "tools": tools,
                "tool_choice": "auto",
            },
        )
        return llm_response.json()

Metrics and monitoring

MCP Server provides metrics in Prometheus format via the /metrics endpoint. Metrics include:

  • Number of requests to each tool
  • Query execution time
  • Errors and exceptions
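A stdlib-only sketch of how such per-tool request counters could be rendered in the Prometheus text exposition format. The server presumably uses a Prometheus client library for this; the metric name and label here are illustrative assumptions:

```python
from collections import Counter

# Illustrative in-memory counter; a real server would use a Prometheus client
tool_requests: Counter = Counter()


def record_request(tool_name: str) -> None:
    """Count one request to the given tool."""
    tool_requests[tool_name] += 1


def render_metrics() -> str:
    """Render the counters in the Prometheus text exposition format,
    as they might appear at the /metrics endpoint."""
    lines = [
        "# HELP mcp_tool_requests_total Number of requests to each tool",
        "# TYPE mcp_tool_requests_total counter",
    ]
    for tool, count in sorted(tool_requests.items()):
        lines.append(f'mcp_tool_requests_total{{tool="{tool}"}} {count}')
    return "\n".join(lines)
```

Prometheus scrapes this plain-text format directly, so each labeled line becomes one time series.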

Development

To format code and check it with linters:

just fmt
just lint

License

MIT License
