MCP Server with Multi-Agent Orchestration
A Model Context Protocol (MCP) server with multi-agent orchestration capabilities, featuring a simple web interface for querying agents. The system uses a locally running Ollama instance for LLM inference and orchestrates multiple agents to process complex queries.
Features
MCP-Compliant: Implements Model Context Protocol standards
FastAPI Server: Modern async Python web framework
Multi-Agent Orchestration: Intelligent query splitting and result synthesis
Local LLM Support: Uses Ollama for local LLM inference
Web Interface: Simple Next.js frontend for querying the server
Automatic Agent Discovery: Agents are automatically discovered and registered
RESTful API: Standard HTTP endpoints for agent management
Quick Start
For detailed setup instructions, see SETUP.md.
Prerequisites
Python 3.11+
Node.js 18+
Ollama installed and running
Model pulled:
ollama pull llama3:latest
Quick Installation
Access the frontend at http://localhost:3000
Architecture
Components
MCP Server (Python/FastAPI)
Orchestrates multi-agent workflows
Uses Ollama for LLM inference
Runs on port 8000
Frontend (Next.js/React)
Simple chat interface
Connects to MCP server
Runs on port 3000
Agents
Internal Agent: Simulates internal document retrieval
External Agent: Simulates external database queries
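The two agents above can be sketched as follows. This is an illustrative Python sketch, not the repository's actual code; the class names, `run` method, and `AGENTS` registry are assumptions.

```python
# Hypothetical sketch of the two simulated agents described above.
# Class and method names are assumptions, not the repository's actual API.

class Agent:
    """Base class: each agent answers a task string with a text result."""
    name = "base"

    def run(self, task: str) -> str:
        raise NotImplementedError

class InternalAgent(Agent):
    """Simulates internal document retrieval."""
    name = "internal"

    def run(self, task: str) -> str:
        return f"[internal docs] results for: {task}"

class ExternalAgent(Agent):
    """Simulates an external database query."""
    name = "external"

    def run(self, task: str) -> str:
        return f"[external db] results for: {task}"

# A simple registry, standing in for automatic agent discovery.
AGENTS = {a.name: a for a in (InternalAgent(), ExternalAgent())}
```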
Orchestrator
Analyzes user queries using LLM
Splits queries into agent-specific tasks
Synthesizes results from multiple agents
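The analyze/split/synthesize flow above can be sketched as a small Python loop. This is a hedged illustration: `llm` is a stub standing in for a call to a local Ollama model, and none of the function names are taken from the repository.

```python
# Minimal sketch of the orchestration workflow. `llm` stands in for a call
# to a local Ollama model; all names here are illustrative assumptions,
# not the repository's actual implementation.

def llm(prompt: str) -> str:
    # Stub: a real version would POST the prompt to Ollama's HTTP API.
    return "stubbed model output for: " + prompt

def split_query(query: str, agent_names: list[str]) -> dict[str, str]:
    """Ask the LLM to derive one task per agent (here: trivially fan out)."""
    _ = llm(f"Split this query for agents {agent_names}: {query}")
    return {name: query for name in agent_names}

def synthesize(query: str, results: dict[str, str]) -> str:
    """Combine per-agent results into a single answer via the LLM."""
    combined = "; ".join(f"{k}: {v}" for k, v in results.items())
    return llm(f"Answer '{query}' using: {combined}")

def orchestrate(query: str, agents: dict) -> str:
    """agents maps an agent name to a callable taking a task string."""
    tasks = split_query(query, list(agents))
    results = {name: agents[name](task) for name, task in tasks.items()}
    return synthesize(query, results)
```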
Workflow
API Endpoints
MCP Server (Port 8000)
GET /health - Health check
POST /orchestrate - Process user query: { "query": "your query here" }
GET /mcp/agents - List all registered agents
GET /mcp/resources - List all MCP resources
POST /discover - Trigger agent discovery
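As a usage sketch, the orchestrate endpoint can be called from Python with only the standard library. The request shape (`POST /orchestrate` with a JSON `query` field) comes from this README; the response structure is not specified here, so the client simply decodes whatever JSON the server returns.

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # MCP server port from this README

def build_orchestrate_request(query: str) -> request.Request:
    """Build the POST /orchestrate request with the documented JSON body."""
    body = json.dumps({"query": query}).encode("utf-8")
    return request.Request(
        f"{BASE_URL}/orchestrate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def orchestrate(query: str) -> dict:
    """Send the request; requires the MCP server to be running on port 8000."""
    with request.urlopen(build_orchestrate_request(query)) as resp:
        return json.loads(resp.read())
```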
Frontend (Port 3000)
GET / - Main chat interface
POST /api/chat - Chat endpoint (forwards to MCP server)
Project Structure
Configuration
Create a .env file by copying env.example.
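A hedged example of what such a .env might contain; the variable names below are assumptions for illustration, and env.example in the repository is authoritative:

```
# Hypothetical values — check env.example for the real variable names.
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3:latest
MCP_SERVER_PORT=8000
```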
Documentation
SETUP.md - Comprehensive setup guide with step-by-step instructions
QUICKSTART.md - Quick start guide (if present)
Development
Running Tests
Viewing Logs
MCP server logs are written to /tmp/mcp_server.log.
Helper Scripts
./start_server.sh - Start MCP server with log viewing
./view_logs.sh - View MCP server logs
Troubleshooting
See SETUP.md for detailed troubleshooting guide.
Common issues:
Ollama not running: Start Ollama and verify with curl http://localhost:11434/api/tags
Port conflicts: Kill processes on ports 8000 or 3000
Module not found: Ensure virtual environment is activated and dependencies installed
License
[Add your license information here]
Contributing
Create a feature branch
Make your changes
Add tests
Submit a pull request