🧠 MCP: Multi-Agent Control Point

This project implements a multi-agent server that routes user queries either to an LLM or to specialized agents (such as date, location, weather, or a technical expert). It includes a simple web interface built with Streamlit for ease of use.


🚀 Features

  • 🌐 Backend with FastAPI

  • 🧠 Specialized agents (date, location, weather, LLM expert)

  • 🧩 Extensible and modular agent system with inheritance

  • ⚙️ Common AgenteBase base class for uniform error and response handling

  • 🤖 Smart logic for agents to collaborate with each other

  • 🖥️ Visual interface with Streamlit (GUI)

  • 🐳 Docker containers for easy deployment

  • 🔌 Client-server communication ready for local or remote network



📁 Project structure

MCP/
├── core/
│   ├── ollama_wrapper.py      # Encapsulates the logic for interacting with LLM models in Ollama
│   ├── context_loader.py      # Loads additional context from the database or other sources
│   └── router_llm.py          # Smart router that decides which agent to use based on the query
├── agents/                    # Folder containing all the agents available in the system
├── server/
│   ├── mcp_server.py          # Central point that manages registered agents and message processing
│   └── api.py                 # Defines the REST API using FastAPI for communication with the GUI or other clients
├── gui/
│   ├── app.py                 # Streamlit application that acts as the system's graphical interface
│   └── .streamlit/
│       └── secrets.toml       # Configuration file containing the backend URL for the GUI
├── utils/
│   ├── db_utils.py            # Helper functions to connect to and query the SQLite database
│   ├── agente_base.py         # AgenteBase base class, common to all custom agents
│   └── json_parser.py         # Utility to split JSON responses into more manageable parts
├── database/
│   ├── context.db             # SQLite database with contextual information for the agents or the LLM
│   ├── comprobar_db.py        # Script that validates the existence and consistency of the database
│   └── create_db.py           # Script to generate and populate the database from scratch
├── config.py                  # Central system configuration file (paths, models, flags, etc.)
├── requirements.txt           # List of Python dependencies required to run the project
├── Dockerfile.backend         # Dockerfile to build the backend container (API + agent logic)
├── Dockerfile.frontend        # Dockerfile to build the Streamlit interface container
└── docker-compose.yml         # File to start the frontend and backend services together

⚙️ Requirements

Python dependencies are listed in requirements.txt. Docker and Docker Compose are needed for the containerized setup described below.

🧪 Quick installation

1. Clone the repository

git clone https://github.com/tu-usuario/MCP.git
cd MCP

2. Create configuration file for Streamlit

Inside the gui directory, create the file:

gui/.streamlit/secrets.toml

With the following content:

server_url = "http://backend:8000/process"
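Streamlit exposes the values in secrets.toml through st.secrets, so the GUI can read the backend URL at startup. A minimal sketch of that lookup (the actual gui/app.py may read and use the value differently):

import streamlit as st

# Read the backend endpoint configured in gui/.streamlit/secrets.toml
server_url = st.secrets["server_url"]  # "http://backend:8000/process"
st.caption(f"Backend: {server_url}")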

3. Run with Docker Compose

docker-compose up --build

This will build and start two containers:

  • Backend at http://localhost:8000

  • Graphical interface at http://localhost:8501


🌍 Access from another machine (optional)

  1. Make sure you expose the ports correctly (8000, 8501).

  2. Use the server machine's IP instead of localhost in secrets.toml.

  3. You can also set up custom Docker networks for cross-host access.


📦 For production

You can run only the backend if you want to integrate it with another interface:

docker build -f Dockerfile.backend -t mcp_backend .
docker run -p 8000:8000 mcp_backend
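With only the backend running, another interface can call the REST API directly. A minimal sketch using requests, where the message field of the JSON payload is an assumption; check server/api.py for the actual request schema:

import requests

# Send a query to the backend's /process endpoint (payload field name is assumed)
response = requests.post(
    "http://localhost:8000/process",
    json={"message": "¿Qué día es hoy?"},
    timeout=30,
)
response.raise_for_status()
print(response.json())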

✨ Example of use

In the web interface, you can type questions like:

  • ¿Qué día es hoy? (What day is it today?)

  • ¿Dónde estoy? (Where am I?)

  • ¿Qué clima hace? (What's the weather like?)

  • Explícame qué es Python (Explain to me what Python is)

The app will decide whether to answer the question directly or delegate it to an agent.
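Each agent declares regular-expression patterns (patrones) for the questions it handles. The sketch below illustrates pattern-based routing with the LLM as fallback; the actual core/router_llm.py may route differently (for example, by asking the LLM itself which agent to use):

import re

def elegir_agente(consulta: str, agentes: list):
    """Return the first agent whose patterns match the query, or None to fall back to the LLM."""
    for agente in agentes:
        if any(re.search(patron, consulta, re.IGNORECASE) for patron in agente.patrones):
            return agente
    return None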


🛠️ Agents available

Agent       Function
DATE        Returns the current date and time
LOCATION    Detects the city and country by IP
CLIMATE     Returns the weather at the current location


🔄 Interaction between agents

The weather agent now directly uses the location agent to determine the geographic coordinates (lat, lon) and city before querying the weather, allowing for responses tailored to the user's actual location. This improves modularity and collaboration between agents.
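A hedged sketch of this collaboration pattern follows. Class names, the import path, and the helper function are assumptions for illustration, not the repository's actual code; see the agents/ folder for the real implementations:

from agentes.base import AgenteBase

def consultar_api_clima(lat: float, lon: float) -> dict:
    # Placeholder for the real weather lookup (e.g. an HTTP call to a weather API)
    return {"lat": lat, "lon": lon, "temperatura": "desconocida"}

class AgenteUbicacion(AgenteBase):
    patrones = [r"d[oó]nde.*estoy"]

    def agente(self) -> dict:
        # Placeholder: the real agent detects city and coordinates by IP
        return {"success": True, "data": {"ciudad": "Madrid", "lat": 40.4, "lon": -3.7}}

class AgenteClima(AgenteBase):
    patrones = [r"clima", r"tiempo.*hace"]

    def agente(self) -> dict:
        ubicacion = AgenteUbicacion().agente()  # reuse the location agent
        if not ubicacion.get("success"):
            return {"success": False, "error": "No se pudo determinar la ubicación"}
        datos = ubicacion["data"]
        clima = consultar_api_clima(datos["lat"], datos["lon"])
        return {"success": True, "data": clima}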


🧩 How to create a new agent

  1. Create a class that inherits from AgenteBase:

from agentes.base import AgenteBase

class AgenteEjemplo(AgenteBase):
    # Regular expressions that identify the questions this agent handles
    patrones = [r"expresiones.*clave", r"otra.*forma.*de.*preguntar"]

    def agente(self) -> dict:
        # Return the agent's data in the standard envelope
        datos = {"respuesta": "Soy un agente de ejemplo"}
        return {"success": True, "data": datos}

  2. Specify the patterns (patrones) used to detect relevant questions.

  3. Implement agente(), which returns a dict with the key success and either data or error.

  4. The agent will automatically use the indicated LLM to generate natural responses based on your data.


⚠️ Important technical notes

  • All agents inherit from AgenteBase, which manages:

    • Standard errors

    • Converting data into a natural-language response via the LLM

  • The agente() method must return a structured dictionary.

  • Each agent specifies which LLM model to use (llm_simple or llm_experto).
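A hedged sketch of what this base-class contract might look like; the real utils/agente_base.py will differ, and any method or attribute name not documented above is an assumption:

import re

class AgenteBase:
    patrones: list[str] = []          # regexes that identify questions for this agent
    modelo_llm: str = "llm_simple"    # each agent picks llm_simple or llm_experto

    def acepta(self, consulta: str) -> bool:
        """Return True if the query matches any of this agent's patterns (assumed helper)."""
        return any(re.search(p, consulta, re.IGNORECASE) for p in self.patrones)

    def agente(self) -> dict:
        """Subclasses return {"success": True, "data": ...} or {"success": False, "error": ...}."""
        raise NotImplementedError

    def responder(self, consulta: str) -> str:
        """Run the agent, handle standard errors, and turn its data into a natural reply."""
        resultado = self.agente()
        if not resultado.get("success"):
            return f"Error: {resultado.get('error', 'desconocido')}"
        # Placeholder for the LLM call that converts structured data into natural language
        return f"[{self.modelo_llm}] {resultado['data']}"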

📄 License

This project is licensed under the MIT License.


🙋‍♂️ Author

Developed by Alejandro Gómez Sierra.
