MCP: Multi-Agent Control Point

by Gomezzz299
Integrations
  • Enables containerized deployment of the MCP server with Docker and Docker Compose, allowing for easier setup and distribution across machines

  • Powers the backend API server that handles routing user questions to appropriate specialized agents

  • Integrates with Ollama to access the deepseek-r1:7b language model for expert responses when specialized agents cannot handle a query

🧠 MCP: Multi-Agent Control Point

This project implements a multi-agent server that routes user queries either to an LLM or to specialized agents (such as date, location, weather, or a technical expert). It includes a simple web interface built with Streamlit for ease of use.


🚀 Features

  • 🌐 Backend with FastAPI
  • 🧠 Specialized agents (date, location, weather, LLM expert)
  • 🖥️ Visual interface with Streamlit (GUI)
  • 🐳 Docker containers for easy deployment
  • 🔌 Client-server communication ready for local or remote network

📁 Project structure

```
MCP/
├── core/
│   ├── registry.py          # Registers all agents
│   └── router_llm.py        # Routes queries between agents
├── agents/
│   └── agent.py             # Each agent included in the server
├── server/
│   ├── mcp_server.py        # MCP logic
│   └── api.py               # FastAPI backend
├── gui/
│   ├── app.py               # Streamlit interface
│   └── .streamlit/
│       └── secrets.toml     # Backend configuration
├── utils/
│   └── json_parser.py       # Helper to split JSON
├── requirements.txt         # Shared dependencies
├── Dockerfile.backend       # Backend image
├── Dockerfile.frontend      # Frontend image
└── docker-compose.yml       # Service orchestration
```

⚙️ Requirements

  • Docker and Docker Compose (the quick installation below relies on them)
  • Ollama with the deepseek-r1:7b model available, for the LLM expert agent

🧪 Quick installation

1. Clone the repository

```
git clone https://github.com/tu-usuario/MCP.git
cd MCP
```

2. Create configuration file for Streamlit

Inside the gui directory, create the file:

gui/.streamlit/secrets.toml

With the following content:

```
server_url = "http://backend:8000/process"
```

3. Run with Docker Compose

```
docker-compose up --build
```

This will build and start two containers:

  • Backend at http://localhost:8000
  • Graphical interface at http://localhost:8501
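Once both containers are up, the backend can also be queried without the GUI. The sketch below is a hypothetical client: the endpoint path comes from `secrets.toml`, but the JSON field name `question` is an assumption — check `server/api.py` for the actual request schema.

```python
# Hypothetical client for the /process endpoint. The "question" field
# name is an assumption; see server/api.py for the real schema.
import json
import urllib.request

def build_request(question: str,
                  url: str = "http://localhost:8000/process"):
    data = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )

def ask(question: str) -> str:
    # Sends the question and returns the raw response body.
    with urllib.request.urlopen(build_request(question)) as resp:
        return resp.read().decode("utf-8")
```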

🌍 Access from another machine (optional)

  1. Make sure the ports (8000, 8501) are exposed correctly.
  2. Use the server machine's IP instead of localhost in secrets.toml.
  3. You can also set up custom Docker networks for cross-host access.

📦 For production

You can run only the backend if you want to integrate it with another interface:

```
docker build -f Dockerfile.backend -t mcp_backend .
docker run -p 8000:8000 mcp_backend
```

✨ Example of use

In the web interface, you can type questions like:

  • What day is today?
  • Where am I?
  • What's the weather like?
  • Explain what Python is

The app will decide whether to answer the question directly or delegate it to an agent.
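As an illustration only (the real routing lives in `core/router_llm.py` and presumably consults the LLM itself), the decision could be sketched as a simple keyword dispatcher:

```python
# Illustrative only: the real router (core/router_llm.py) reportedly
# uses an LLM to pick the agent; this just shows the decision shape.
def route(question: str) -> str:
    q = question.lower()
    if "day" in q or "date" in q:
        return "DATE"
    if "where" in q:
        return "LOCATION"
    if "weather" in q:
        return "CLIMATE"
    return "LLM_EXPERT"  # fall back to the LLM expert
```

For example, `route("What day is today?")` selects the DATE agent, while an open-ended question like "Explain what Python is" falls through to the LLM expert.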


🛠️ Agents available

| Agent | Function |
| --- | --- |
| DATE | Returns the current date and time |
| LOCATION | Detects the city and country by IP |
| CLIMATE | Returns the weather at the current location |
| LLM_EXPERT | Queries the deepseek-r1:7b model via Ollama |
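A hypothetical sketch of what `core/registry.py` might look like — a plain mapping from agent names to handler functions; the real implementation may differ:

```python
# Hypothetical sketch of core/registry.py: a plain mapping from agent
# names to handler functions. The real implementation may differ.
from datetime import datetime

def date_agent(question: str) -> str:
    """DATE: returns the current date and time."""
    return datetime.now().isoformat()

def llm_expert(question: str) -> str:
    """LLM_EXPERT: placeholder for the Ollama deepseek-r1:7b call."""
    return f"[deepseek-r1:7b would answer: {question}]"

AGENTS = {
    "DATE": date_agent,
    "LLM_EXPERT": llm_expert,
    # LOCATION and CLIMATE omitted here; both need external services
}

def dispatch(agent_name: str, question: str) -> str:
    return AGENTS[agent_name](question)
```

A registry like this keeps the router decoupled from the agents: adding a new agent means adding one entry to the mapping.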

📄 License

This project is licensed under the MIT License.


🙋‍♂️ Author

Powered by [Your Name or Alias].


