🧠 MCP: Multi-Agent Control Point
This project implements a multi-agent server that routes user queries to specialized agents (date, location, weather) or to an LLM-backed technical expert. It includes a simple web interface built with Streamlit for ease of use.
🚀 Features
- 🌐 Backend with FastAPI
- 🧠 Specialized agents (date, location, weather, LLM expert)
- 🖥️ Visual interface with Streamlit (GUI)
- 🐳 Docker containers for easy deployment
- 🔌 Client-server communication ready for local or remote network
📁 Project structure
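The original tree was not preserved here; based on the setup steps below, the layout is likely along these lines (all names except `gui` and `secrets.toml` are assumptions):

```
.
├── backend/                  # FastAPI server and agent routing (assumed name)
├── gui/                      # Streamlit interface
│   └── .streamlit/
│       └── secrets.toml      # Points the GUI at the backend
└── docker-compose.yml        # Builds and starts both containers
```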
⚙️ Requirements
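The original list was not preserved here; at a minimum you will likely need:

- Docker and Docker Compose
- Ollama with the deepseek-r1:7b model pulled (used by the LLM_EXPERT agent)
- Network access, since the location and weather agents resolve your position by IP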
🧪 Quick installation
1. Clone the repository
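Assuming a standard git workflow (the URL below is a placeholder, not the real repository address):

```bash
git clone https://github.com/your-user/mcp-multi-agent.git  # placeholder URL
cd mcp-multi-agent
```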
2. Create the Streamlit configuration file

Inside the `gui` directory, create the file `.streamlit/secrets.toml` with content along the following lines:
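The original snippet was not preserved, and the exact key name depends on how the GUI code reads its secrets; `BACKEND_URL` below is an assumed name:

```toml
# gui/.streamlit/secrets.toml
# Assumed key name -- match whatever the Streamlit app actually reads
BACKEND_URL = "http://localhost:8000"
```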
3. Run with Docker Compose
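From the project root (assuming the compose file lives there):

```bash
docker compose up --build
```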
This will build and start two containers:

- Backend at `http://localhost:8000`
- Graphical interface at `http://localhost:8501`
🌍 Access from another machine (optional)

- Make sure the ports (`8000`, `8501`) are exposed correctly.
- Use the server machine's IP instead of `localhost` in `secrets.toml` (see the example below).
- You can also set up custom Docker networks for cross-host access.
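For example, if the server's LAN address were `192.168.1.50` (a hypothetical value), the GUI configuration would become:

```toml
# gui/.streamlit/secrets.toml -- assumed key name, as above
BACKEND_URL = "http://192.168.1.50:8000"
```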
📦 For production
You can run only the backend if you want to integrate it with another interface:
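With Docker Compose this is typically done by starting a single service; the service name `backend` is an assumption based on the layout above:

```bash
docker compose up -d backend
```

Another interface can then call the API directly. The endpoint name and payload below are hypothetical, since the route is not documented here:

```bash
curl -X POST http://localhost:8000/ask \
  -H "Content-Type: application/json" \
  -d '{"question": "What day is it today?"}'
```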
✨ Usage example

In the web interface, you can type questions like:

- What day is it today?
- Where am I?
- What's the weather like?
- Explain what Python is
The app will decide whether to answer the question directly or delegate it to an agent.
🛠️ Available agents

| Agent | Function |
|---|---|
| DATE | Returns the current date and time |
| LOCATION | Detects the city and country by IP |
| CLIMATE | Returns the weather at the current location |
| LLM_EXPERT | Queries the deepseek-r1:7b model via Ollama |
📄 License
This project is licensed under the MIT License.
🙋‍♂️ Author
Powered by [Your Name or Alias].