MCP Backend OpenRouter
A high-performance chatbot platform connecting MCP servers with LLM APIs for intelligent tool execution.
🚀 Quick Start
Connect: ws://localhost:8000/ws/chat
📡 WebSocket API
Send Messages
Receive Responses
Message Types
| Type | Purpose | Payload |
|---|---|---|
| `user_message` | Send user input | `{"type": "user_message", "message": "text"}` |
| `clear_history` | Start new session | `{"type": "clear_history"}` |
| `assistant_message` | AI response | `{"type": "assistant_message", "message": "text", "thinking": "reasoning"}` |
| `tool_execution` | Tool status | `{"type": "tool_execution", "tool_name": "name", "status": "executing"}` |
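The message types above can be exercised with a small client-side sketch. The payload shapes come straight from the table; the helper function names are illustrative, not part of the project's API:

```python
import json

def make_user_message(text: str) -> str:
    # Client -> server: send user input (shape from the Message Types table).
    return json.dumps({"type": "user_message", "message": text})

def make_clear_history() -> str:
    # Client -> server: start a new session.
    return json.dumps({"type": "clear_history"})

def handle_incoming(raw: str) -> str:
    # Server -> client: dispatch on the "type" field.
    msg = json.loads(raw)
    if msg["type"] == "assistant_message":
        return f"assistant: {msg['message']}"
    if msg["type"] == "tool_execution":
        return f"tool {msg['tool_name']}: {msg['status']}"
    raise ValueError(f"unknown message type: {msg['type']}")

print(handle_incoming(
    '{"type": "tool_execution", "tool_name": "search", "status": "executing"}'
))  # → tool search: executing
```

In a real client these strings would be sent and received over the `ws://localhost:8000/ws/chat` socket.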
⚙️ Configuration
Essential Settings (src/runtime_config.yaml)
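A minimal sketch of what runtime_config.yaml might contain. The key names below are assumptions for illustration, not the project's actual schema — check the shipped file for the real keys:

```yaml
# Hypothetical example; actual keys may differ.
llm:
  provider: openrouter        # openrouter | openai | groq
  model: openai/gpt-4o-mini
server:
  host: 0.0.0.0
  port: 8000
```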
MCP Servers (servers_config.json)
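MCP client configs conventionally list servers under an `mcpServers` key with a launch command per server. Assuming this project follows that common layout, an entry might look like:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

The `filesystem` entry is illustrative; add one block per MCP server you want the platform to connect to.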
🔧 Performance Tuning
Streaming Optimization
HTTP/2 Support
🛠️ Development
Code Standards
- Use `uv` for package management
- Pydantic for data validation
- Type hints required
- Fail-fast error handling
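Fail-fast error handling pairs naturally with validated models: reject bad input at construction time instead of deep inside a handler. A stdlib sketch of the idea (the project itself uses Pydantic; `ChatRequest` here is a hypothetical model, not one from the codebase):

```python
from dataclasses import dataclass

@dataclass
class ChatRequest:
    """Hypothetical request model; the real project uses Pydantic for this."""
    type: str
    message: str

    def __post_init__(self) -> None:
        # Fail fast: invalid data never produces a usable object.
        if self.type != "user_message":
            raise ValueError(f"unsupported type: {self.type}")
        if not self.message.strip():
            raise ValueError("message must be non-empty")

req = ChatRequest(type="user_message", message="hello")
print(req.message)  # → hello
```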
Available Scripts
Code Formatting
📁 Project Structure
🔑 Environment Variables
🚨 Troubleshooting
Common Issues
| Problem | Solution |
|---|---|
| Configuration not updating | Check file permissions on `runtime_config.yaml` |
| WebSocket connection fails | Verify the server is running and the port is correct |
| MCP server errors | Check `servers_config.json` and server availability |
| LLM API issues | Verify API keys and model configuration |
Debug Mode
Component Testing
✅ Features
- Full MCP Protocol - Tools, prompts, resources
- High Performance - SQLite with WAL mode, optimized indexes
- Real-time Streaming - WebSocket with delta persistence
- Multi-Provider - OpenRouter (100+ models), OpenAI, Groq
- Type Safe - Pydantic validation throughout
- Dynamic Configuration - Runtime changes without restart
- Auto-Persistence - Automatic conversation storage
📚 Quick Reference
| Command | Purpose |
|---|---|
| `uv run python src/main.py` | Start the platform |
| `uv run mcp-reset-config` | Reset to default config |
| Edit `runtime_config.yaml` | Change settings (auto-reload) |
| Edit `servers_config.json` | Configure MCP servers |
🆘 Support
- Check logs for detailed error messages
- Verify configuration syntax with YAML validator
- Test individual components for isolation
- Monitor WebSocket connections and database size
Requirements: Python 3.13+, uv package manager