MCP-Maestro

Every great research operation needs a conductor.

This MCP server turns Maestro into a tool your AI assistant can actually direct. Think of it as the bridge between "write me a report" and having a whole research orchestra play in harmony.

What is Maestro?

Maestro is an AI research framework with serious infrastructure. While others send one agent to do one thing, Maestro coordinates multiple specialized agents — planning, research, writing, reflection — all working together to produce properly structured, multi-section research output.

The backend runs an agentic layer on top of multiple LLM calls, manages research cycles, and maintains a document pipeline with embeddings and reranking.

What does this MCP server do?

It exposes Maestro's mission management system through MCP. You can:

  • Fire off missions and let Maestro's agents do the heavy lifting

  • Track progress in real-time as sections get researched

  • Pause, resume, or stop research mid-flight

  • Pull reports once the orchestra finishes playing

The Full Suite of Tools

Tool             What it does
create_mission   Launch a new research mission
get_report       Pull the research report when done
get_notes        Get all research notes collected
resume           Continue a paused mission
stop             Cancel a running mission
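Under the hood, each tool presumably maps onto a Maestro REST endpoint. The sketch below illustrates one way such a mapping could look; the endpoint paths are assumptions for illustration, not Maestro's documented API.

```python
# Hypothetical mapping from MCP tool names to Maestro REST calls.
# The endpoint paths below are illustrative assumptions, not the
# documented Maestro API.

TOOL_ROUTES = {
    "create_mission": ("POST", "/api/missions"),
    "get_report":     ("GET",  "/api/missions/{mission_id}/report"),
    "get_notes":      ("GET",  "/api/missions/{mission_id}/notes"),
    "resume":         ("POST", "/api/missions/{mission_id}/resume"),
    "stop":           ("POST", "/api/missions/{mission_id}/stop"),
}

def route_tool(name: str, mission_id: str = "") -> tuple[str, str]:
    """Resolve a tool name to an (HTTP method, path) pair."""
    method, path = TOOL_ROUTES[name]
    return method, path.format(mission_id=mission_id)
```

A lookup table like this keeps the MCP-facing tool names decoupled from whatever the backend routes actually are.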

Getting Started

1. Get Maestro Conducting

# Docker compose is the easiest path
git clone https://github.com/Dianachong/maestro.git
cd maestro/docker
docker compose up

This spins up the backend API plus PostgreSQL with pgvector for embeddings.

For more complex setups, check the official deployment docs.

2. Set Up This Server

git clone https://github.com/Dianachong/mcp-maestro.git
cd mcp-maestro
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

3. Point It At Maestro

cp .env.example .env

Set MAESTRO_BASE_URL to your Maestro API endpoint.
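A minimal .env might look like this; the port is an example, not a project default, so match it to wherever your Maestro backend is listening.

```shell
# .env — example values, not defaults shipped with the project
MAESTRO_BASE_URL=http://localhost:8000   # adjust to your Maestro API port
LOG_LEVEL=INFO
```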

Connecting to Your AI

Claude Desktop

In ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "maestro": {
      "command": "python",
      "args": ["/path/to/mcp-maestro/server.py"],
      "env": {
        "MAESTRO_BASE_URL": "http://localhost:YOUR_PORT"
      }
    }
  }
}

OpenCode / Cursor

Check your IDE's docs for MCP server configuration. The server runs as a long-lived HTTP process.

How It Works

The Mission Lifecycle

create_mission → running → completed
       running → [pause] → paused → [resume] → running
       running → [stop]  → cancelled
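The lifecycle above can be sketched as a small state machine. This is a reading of the diagram, not the server's actual internals, which may track more states; the "finish" action standing in for normal completion is an assumption.

```python
# Minimal sketch of the mission lifecycle as a state machine.
# State and action names mirror the diagram above; "finish" is an
# assumed stand-in for a mission completing on its own.

TRANSITIONS = {
    ("running", "pause"):  "paused",
    ("paused",  "resume"): "running",
    ("running", "stop"):   "cancelled",
    ("running", "finish"): "completed",
}

def step(state: str, action: str) -> str:
    """Apply one lifecycle action, rejecting invalid transitions."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"cannot {action} while {state}")

state = "running"              # create_mission starts here
state = step(state, "pause")   # → paused
state = step(state, "resume")  # → running
```

Note that stop and pause only make sense from the running state: you cannot, say, resume a cancelled mission.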
  1. Create with your research request

  2. Track status as agents do their thing

  3. Pull the report when it completes

Example Flow

You: Create a research mission about advances in solid-state batteries
CLI: Mission created with ID: mission-abc123

You: Check status of mission abc123
CLI: Status: running, Section 2/5 complete

You: Get research notes for mission abc123
CLI: [Array of research notes from agents]

You: Get report for mission abc123
CLI: [Full multi-section research report]
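The flow above boils down to a create-poll-fetch pattern. Here is a sketch of that loop with the transport injected as a function, since the actual HTTP endpoints are not documented here; the `fake_fetch` stand-in exists only to make the example self-contained.

```python
# Sketch of the polling pattern behind the flow above: create a mission,
# poll status until it leaves "running", then fetch the report. `fetch`
# is injected so the loop stays independent of the (assumed) endpoints.

def run_mission(fetch, query: str) -> str:
    mission_id = fetch("create", query)            # e.g. "mission-abc123"
    while fetch("status", mission_id) == "running":
        pass                                       # in practice: sleep between polls
    return fetch("report", mission_id)

# Tiny in-memory stand-in for the Maestro API, for illustration only.
def fake_fetch(op, arg, _state={"polls": 0}):
    if op == "create":
        return "mission-abc123"
    if op == "status":
        _state["polls"] += 1
        return "running" if _state["polls"] < 3 else "completed"
    if op == "report":
        return f"[report for {arg}]"
```

A real client would sleep between polls and also handle the paused and cancelled states rather than assuming the mission runs to completion.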

What's Inside a Mission

  • Planning Agent: Breaks down the research into sections

  • Research Agents: Hunt for information on each section

  • Writing Agent: Synthesizes findings into prose

  • Reflection Agent: Reviews and suggests improvements

  • Note Assignment: Tracks all sources and findings

Environment Variables

Variable           Default      Description
MAESTRO_BASE_URL   (required)   Where Maestro's API lives
LOG_LEVEL          INFO         DEBUG for noisy logs
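A server reading these variables would plausibly fail fast when the required one is missing. The sketch below shows that pattern; treating a missing MAESTRO_BASE_URL as a hard error is an assumption matching the "(required)" in the table, not verified behavior of this server.

```python
import os

# Sketch of loading the two variables from the table above.
# Failing fast on a missing MAESTRO_BASE_URL is an assumption,
# matching the "(required)" marker in the table.

def load_config(env=os.environ) -> dict:
    base_url = env.get("MAESTRO_BASE_URL")
    if not base_url:
        raise RuntimeError("MAESTRO_BASE_URL is required")
    return {
        "base_url": base_url.rstrip("/"),   # normalize trailing slash
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }
```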

Troubleshooting

Mission won't start

  • Is Maestro's API responding? Check MAESTRO_BASE_URL in .env

  • Check Maestro's logs for what went wrong

Mission stuck

  • Use stop to cancel, then create_mission with refined query

Connection refused

  • Firewall? Port conflict? Docker not running?

  • Try docker ps to confirm Maestro is up
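For a quick check from the Python side, a plain TCP connection to the host and port in MAESTRO_BASE_URL distinguishes "nothing is listening" from an application-level error. This is a generic socket-level sketch, not a feature of this server.

```python
import socket
from urllib.parse import urlparse

# TCP-level check that something is listening where MAESTRO_BASE_URL
# points. A refused connection here usually means Docker isn't running
# or the port in .env is wrong.

def is_listening(base_url: str, timeout: float = 2.0) -> bool:
    parsed = urlparse(base_url)
    host = parsed.hostname or "localhost"
    port = parsed.port or (443 if parsed.scheme == "https" else 80)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```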

Contributing

Issues welcome. If you find a bug, include the mission ID if applicable.

License

MIT
