
by Gomezzz299

🧠 MCP: Multi-Agent Control Point

This project implements a multi-agent server that routes user queries either to an LLM or to specialized agents (such as date, location, weather, or a technical expert). It includes a simple web interface built with Streamlit for ease of use.


🚀 Features

  • 🌐 Backend with FastAPI
  • 🧠 Specialized agents (date, location, weather, LLM expert)
  • 🧩 Extensible and modular agent system with inheritance
  • ⚙️ Common base class AgenteBase for uniform error and response handling
  • 🤖 Smart logic for agents to collaborate with each other
  • 🖥️ Visual interface with Streamlit (GUI)
  • 🐳 Docker containers for easy deployment
  • 🔌 Client-server communication ready for local or remote network

📁 Project structure

```
MCP/
├── core/
│   ├── ollama_wrapper.py     # Encapsulates the logic for interacting with LLM models in Ollama
│   ├── context_loader.py     # Loads additional context from a database or other sources
│   └── router_llm.py         # Smart router that decides which agent to use based on the query
├── agents/                   # Folder containing all the agents available in the system
├── server/
│   ├── mcp_server.py         # Central point that manages registered agents and message processing
│   └── api.py                # Defines the REST API using FastAPI for communication with the GUI or other clients
├── gui/
│   ├── app.py                # Streamlit application that acts as the system's graphical interface
│   └── .streamlit/
│       └── secrets.toml      # Configuration file containing the backend URL for the GUI
├── utils/
│   ├── db_utils.py           # Helper functions to connect to and query the SQLite database
│   ├── agente_base.py        # Base class AgenteBase, common to all custom agents
│   └── json_parser.py        # Utility to split JSON responses into more manageable parts
├── database/
│   ├── context.db            # SQLite database with contextual information for the agents or the LLM
│   ├── comprobar_db.py       # Script that validates the existence and consistency of the database
│   └── create_db.py          # Script to generate and populate the database from scratch
├── config.py                 # Central system configuration file (paths, models, flags, etc.)
├── requirements.txt          # List of Python dependencies required to run the project
├── Dockerfile.backend        # Dockerfile to build the backend container (API + agent logic)
├── Dockerfile.frontend       # Dockerfile to build the Streamlit interface container
└── docker-compose.yml        # File to start the frontend and backend services together
```

⚙️ Requirements

  • Docker and Docker Compose (for the containerized setup described below)

🧪 Quick installation

1. Clone the repository

```bash
git clone https://github.com/tu-usuario/MCP.git
cd MCP
```

2. Create configuration file for Streamlit

Inside the gui directory, create the file:

gui/.streamlit/secrets.toml

With the following content:

```toml
server_url = "http://backend:8000/process"
```

3. Run with Docker Compose

```bash
docker-compose up --build
```

This will build and start two containers:

  • Backend at http://localhost:8000
  • Graphical interface at http://localhost:8501
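
A `docker-compose.yml` for this setup would look roughly like the sketch below. This is illustrative, not the repository's actual file; the service name `backend` matches the hostname used in `secrets.toml`:

```yaml
services:
  backend:
    build:
      context: .
      dockerfile: Dockerfile.backend
    ports:
      - "8000:8000"
  frontend:
    build:
      context: .
      dockerfile: Dockerfile.frontend
    ports:
      - "8501:8501"
    depends_on:
      - backend
```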

🌍 Access from another machine (optional)

  1. Make sure the ports (8000, 8501) are exposed correctly.
  2. Use the server machine's IP instead of localhost in secrets.toml.
  3. You can also set up custom Docker networks for cross-host access.
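
For example, if the server machine's IP were 192.168.1.50 (an illustrative address, not a real one), `gui/.streamlit/secrets.toml` on the client side would contain:

```toml
server_url = "http://192.168.1.50:8000/process"
```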

📦 For production

You can run only the backend if you want to integrate it with another interface:

```bash
docker build -f Dockerfile.backend -t mcp_backend .
docker run -p 8000:8000 mcp_backend
```
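
A client integrating with the standalone backend would POST the user's message to the `/process` endpoint. The sketch below builds such a request with only the standard library; the JSON field name `message` is an assumption — check `server/api.py` for the real request schema:

```python
import json
import urllib.request

BACKEND_URL = "http://localhost:8000/process"  # port published by the docker run above


def build_request(message: str, url: str = BACKEND_URL) -> urllib.request.Request:
    """Build the POST request a client would send to the backend.

    The payload field name ("message") is assumed, not documented above.
    """
    payload = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )


req = build_request("¿Qué clima hace?")
```

Sending it would then be a matter of `urllib.request.urlopen(req)` once the container is running.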

✨ Example of use

In the web interface, you can type questions like:

  • ¿Qué día es hoy? (What day is it today?)
  • ¿Dónde estoy? (Where am I?)
  • ¿Qué clima hace? (What's the weather like?)
  • Explícame qué es Python (Explain to me what Python is)

The app will decide whether to answer the question directly or delegate it to an agent.
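
That routing decision can be sketched as simple pattern matching over the query, falling back to the LLM when no agent matches. The pattern strings here are illustrative, not the project's actual `patrones`:

```python
import re

# One pattern list per agent; illustrative patterns only.
AGENT_PATTERNS = {
    "DATE": [r"qué día", r"fecha"],
    "LOCATION": [r"dónde estoy", r"ubicación"],
    "CLIMATE": [r"clima", r"tiempo hace"],
}


def route(query: str) -> str:
    """Return the first agent whose patterns match; fall back to the LLM."""
    q = query.lower()
    for agent, patterns in AGENT_PATTERNS.items():
        if any(re.search(p, q) for p in patterns):
            return agent
    return "LLM"
```

In the real project this decision is made by `core/router_llm.py`, which may also consult the LLM itself rather than rely on regexes alone.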


🛠️ Agents available

AgentFunction
DATEReturns the current date and time
LOCATIONDetects the city and country by IP
CLIMATEReturns the weather at the current location

🔄 Interaction between agents

The weather agent now directly uses the location agent to determine the geographic coordinates (lat, lon) and city before querying the weather, allowing responses tailored to the user's actual location. This improves modularity and collaboration between agents.
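
The collaboration can be pictured as one agent calling into another before doing its own work. This is a simplified stand-in, not the project's actual classes or data:

```python
def locate() -> dict:
    """Stand-in for the location agent: returns coordinates and city.

    Values are hard-coded for illustration; the real agent detects them by IP.
    """
    return {"lat": 40.4168, "lon": -3.7038, "city": "Madrid"}


def weather_report() -> dict:
    """The weather agent first asks the location agent where the user is."""
    loc = locate()
    # ...then it would query a weather service with loc["lat"], loc["lon"]...
    forecast = "sunny"  # placeholder result standing in for the real API call
    return {"success": True, "data": {"city": loc["city"], "forecast": forecast}}
```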


🧩 How to create a new agent

  1. Create a class that inherits from AgenteBase:
```python
from agentes.base import AgenteBase

class AgenteEjemplo(AgenteBase):
    patrones = [r"expresiones.*clave", r"otra.*forma.*de.*preguntar"]

    def agente(self) -> dict:
        datos = {"respuesta": "Soy un agente de ejemplo"}
        return {"success": True, "data": datos}
```
  2. Specify patterns (patrones) to detect relevant questions.
  3. Implement agente(), which returns a dict with the key success and either data or error.
  4. The agent will automatically use the indicated LLM to generate natural responses based on your data.
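
Putting the steps together, here is a self-contained sketch. This `AgenteBase` is a simplified stand-in for the project's real base class (which also handles errors and LLM-based response generation), included only so the example runs on its own:

```python
import re


class AgenteBase:
    """Simplified stand-in for the project's AgenteBase."""

    patrones: list[str] = []

    def matches(self, query: str) -> bool:
        """Return True if any pattern matches the (lowercased) query."""
        return any(re.search(p, query.lower()) for p in self.patrones)

    def agente(self) -> dict:
        raise NotImplementedError


class AgenteEjemplo(AgenteBase):
    patrones = [r"expresiones.*clave"]

    def agente(self) -> dict:
        return {"success": True, "data": {"respuesta": "Soy un agente de ejemplo"}}


agente = AgenteEjemplo()
```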

⚠️ Important technical notes

  • All agents inherit from AgenteBase, which manages:
    • Standard errors
    • Converting data to natural response via LLM
  • The agente() method must return a structured dictionary.
  • Each agent specifies which LLM model to use (llm_simple or llm_experto).
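
The uniform success/error contract those notes describe can be sketched as follows. This is an assumption about how the base class might normalize results; in the real project, the success branch is turned into a natural-language reply by the configured LLM rather than stringified directly:

```python
def respond(raw: dict) -> str:
    """Turn an agent's structured result into a user-facing message.

    Simplified: the real AgenteBase passes the success branch through an LLM.
    """
    if raw.get("success"):
        return str(raw["data"])
    return f"Error: {raw.get('error', 'unknown')}"
```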

📄 License

This project is licensed under the MIT License.


🙋‍♂️ Author

Developed by Alejandro Gómez Sierra.
