Multi-LLM Gateway MCP
Multi-LLM Gateway for mcp.observabilidadebrasil.org
An MCP (Model Context Protocol) server that acts as an intelligent gateway to multiple LLM backends, with support for streaming, rate limiting, and monitoring.
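In practice a client talks to the gateway through an OpenAI-style chat endpoint (POST /v1/chat/completions, documented in the API section below). A minimal sketch of building such a request; the helper name `chat_payload` is illustrative, not part of the project:

```python
# Hypothetical helper that builds the gateway's chat request body.
# The "provider" field is this gateway's extension to the OpenAI shape.
import json

def chat_payload(content, model="gpt-4", provider=None, stream=False):
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": content}],
        "stream": stream,
    }
    if provider is not None:
        payload["provider"] = provider  # omit to use MCP_DEFAULT_PROVIDER
    return payload

# Serialize and send with any HTTP client (requests, httpx, curl, ...).
body = json.dumps(chat_payload("Hello!", provider="openai", stream=True))
print(body)
```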
Architecture
┌─────────────────────────────────────────────────────────────┐
│                  nginx (SSL + Rate Limit)                   │
│                mcp.observabilidadebrasil.org                │
└──────────────────────────────┬──────────────────────────────┘
                               │
                               ▼
┌─────────────────────────────────────────────────────────────┐
│                    MCP Server (FastAPI)                     │
│                          Port 9200                          │
├─────────────────────────────────────────────────────────────┤
│  ┌─────────┐  ┌─────────┐  ┌─────────┐  ┌─────────┐         │
│  │ OpenAI  │  │ Claude  │  │ Ollama  │  │ Custom  │         │
│  │ Provider│  │ Provider│  │ Provider│  │ Provider│         │
│  └────┬────┘  └────┬────┘  └────┬────┘  └────┬────┘         │
│       │            │            │            │              │
│       └────────────┴─────┬──────┴────────────┘              │
│                          │                                  │
│               ┌──────────▼──────────┐                       │
│               │     LLM Router      │                       │
│               │   (load balance,    │                       │
│               │ fallback, routing)  │                       │
│               └─────────────────────┘                       │
└─────────────────────────────────────────────────────────────┘
Features
Multi-LLM Backend: Support for OpenAI, Anthropic Claude, Ollama (local), and custom providers
SSE Streaming: Real-time responses via Server-Sent Events
Rate Limiting: Protection against abuse (nginx + application)
Automatic Fallback: If one provider fails, the next one is tried
Monitoring: Separate dashboard for requests, methods, and abuse
Health Checks: Health endpoints for each provider
Docker Ready: Simplified deployment with Docker Compose
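The automatic-fallback feature can be sketched as follows. This is an illustrative reimplementation of the idea, not the project's actual router code; the names `Provider`, `LLMRouter`, and `ProviderError` are assumptions.

```python
# Sketch of the fallback router: try each configured provider in priority
# order and return the first successful response.

class ProviderError(Exception):
    """Raised when a provider cannot serve the request."""

class Provider:
    def __init__(self, name, complete):
        self.name = name
        self._complete = complete  # callable: prompt -> str, may raise

    def complete(self, prompt):
        return self._complete(prompt)

class LLMRouter:
    def __init__(self, providers):
        self.providers = providers  # ordered by priority

    def complete(self, prompt, preferred=None):
        # Move the preferred provider (the request's "provider" field)
        # to the front; the sort is stable, so the rest keep their order.
        ordered = sorted(self.providers, key=lambda p: p.name != preferred)
        errors = []
        for provider in ordered:
            try:
                return provider.name, provider.complete(prompt)
            except ProviderError as exc:
                errors.append((provider.name, exc))  # fall through to next
        raise ProviderError(f"all providers failed: {errors}")

def _down(prompt):
    raise ProviderError("upstream timeout")

router = LLMRouter([
    Provider("openai", _down),                   # simulate an outage
    Provider("claude", lambda p: f"echo: {p}"),
])
name, reply = router.complete("Hello!")
print(name, reply)  # claude echo: Hello!
```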
Installation
Requirements
Python 3.11+
nginx (for production)
Docker (optional)
Local Development
# Clone the repository
git clone https://github.com/tgosoul2019/mcp.git
cd mcp
# Create a virtual environment
python3 -m venv .venv
source .venv/bin/activate
# Install dependencies
pip install -e ".[dev]"
# Configure environment variables
cp .env.example .env
# Edit .env with your API keys
# Run the server
python -m mcp_server
Production (VPS)
# On the server
cd /dados
git clone https://github.com/tgosoul2019/mcp.git
cd mcp
# Setup
./scripts/setup.sh
# Start the service
sudo systemctl start mcp-server
Configuration
Environment Variables
# Server
MCP_HOST=127.0.0.1
MCP_PORT=9200
MCP_DEBUG=false
# LLM Providers (configure only the ones you use)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
OLLAMA_BASE_URL=http://localhost:11434
# Default Provider
MCP_DEFAULT_PROVIDER=openai
# Rate Limiting (application level)
MCP_RATE_LIMIT_REQUESTS=100
MCP_RATE_LIMIT_WINDOW=60
# Logging
MCP_LOG_LEVEL=INFO
MCP_LOG_FILE=/var/log/mcp/mcp.log
API Endpoints
Chat Completion
POST /v1/chat/completions
Content-Type: application/json
{
  "model": "gpt-4",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ],
  "stream": true,
  "provider": "openai"  # optional; the default provider is used if omitted
}
Health Check
GET /health
GET /health/providers
Metrics
GET /metrics
Monitoring
The MCP has its own monitoring dashboard, separate from the KCP one:
URL: https://mcp.observabilidadebrasil.org/admin/monitor
Requests per provider
Average latency
Error rate
Most active IPs
Abuse detection
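The per-provider numbers the dashboard reports can be sketched as a small in-memory collector. This is an assumption about the shape of the data, not the actual `collector.py` implementation; all names here are illustrative.

```python
# Illustrative metrics collector: requests per provider, average latency,
# and error rate, roughly mirroring what a GET /metrics endpoint could expose.
from collections import defaultdict

class MetricsCollector:
    def __init__(self):
        self.requests = defaultdict(int)
        self.errors = defaultdict(int)
        self.latency_total = defaultdict(float)

    def record(self, provider, latency_s, ok=True):
        self.requests[provider] += 1
        self.latency_total[provider] += latency_s
        if not ok:
            self.errors[provider] += 1

    def snapshot(self):
        return {
            provider: {
                "requests": n,
                "avg_latency_s": self.latency_total[provider] / n,
                "error_rate": self.errors[provider] / n,
            }
            for provider, n in self.requests.items()
        }

collector = MetricsCollector()
collector.record("openai", 0.8)
collector.record("openai", 1.2, ok=False)
print(collector.snapshot()["openai"])
# {'requests': 2, 'avg_latency_s': 1.0, 'error_rate': 0.5}
```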
Docker
# Build
docker build -t mcp-server .
# Run
docker run -d \
--name mcp-server \
-p 9200:9200 \
-e OPENAI_API_KEY=sk-... \
mcp-server
Project Structure
mcp/
├── mcp_server/
│   ├── __init__.py
│   ├── __main__.py
│   ├── app.py            # FastAPI app
│   ├── config.py         # Settings
│   ├── router.py         # LLM Router
│   ├── providers/
│   │   ├── __init__.py
│   │   ├── base.py       # Abstract provider
│   │   ├── openai.py
│   │   ├── anthropic.py
│   │   └── ollama.py
│   ├── middleware/
│   │   ├── __init__.py
│   │   ├── rate_limit.py
│   │   └── logging.py
│   └── monitor/
│       ├── __init__.py
│       ├── collector.py  # Metrics
│       └── dashboard.py  # UI
├── infra/
│   ├── nginx/
│   │   └── mcp.conf
│   ├── systemd/
│   │   └── mcp-server.service
│   └── docker/
│       ├── Dockerfile
│       └── docker-compose.yml
├── scripts/
│   ├── setup.sh
│   └── deploy.sh
├── tests/
├── pyproject.toml
├── .env.example
└── README.md
License
MIT
Links
Production: https://mcp.observabilidadebrasil.org
Repository: https://github.com/tgosoul2019/mcp