Multi-Model Advisor
(锵锵四人行)
A Model Context Protocol (MCP) server that queries multiple Ollama models and combines their responses, providing diverse AI perspectives on a single question. This creates a "council of advisors" approach where Claude can synthesize multiple viewpoints alongside its own to provide more comprehensive answers.
Features
- Query multiple Ollama models with a single question
- Assign different roles/personas to each model
- View all available Ollama models on your system
- Customize system prompts for each model
- Configure via environment variables
- Integrate seamlessly with Claude for Desktop
Prerequisites
- Node.js 16.x or higher
- Ollama installed and running (see Ollama installation)
- Claude for Desktop (for the complete advisory experience)
Installation
Installing via Smithery
To install multi-ai-advisor-mcp for Claude Desktop automatically via Smithery:
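The Smithery CLI follows the general pattern below; the exact package identifier is an assumption here, so check the Smithery registry for the published name:

```bash
npx -y @smithery/cli install multi-ai-advisor-mcp --client claude
```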
Manual Installation
- Clone this repository
- Install dependencies
- Build the project
- Install required Ollama models (example commands for all four steps are sketched below)
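A typical sequence looks like this; the repository owner in the clone URL and the specific model names are placeholders/assumptions, so substitute the ones you actually use:

```bash
# Clone the repository and enter it (substitute the actual repository URL)
git clone https://github.com/<owner>/multi-ai-advisor-mcp.git
cd multi-ai-advisor-mcp

# Install dependencies and build the project
npm install
npm run build

# Pull the Ollama models you plan to use (these names are examples;
# any locally available Ollama models can be configured)
ollama pull gemma3:1b
ollama pull llama3.2:1b
ollama pull deepseek-r1:1.5b
```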
Configuration
Create a .env file in the project root with your desired configuration:
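For example (a sketch only: apart from OLLAMA_API_URL, which the Troubleshooting section references, the variable names and models below are illustrative assumptions):

```ini
# Where the server should reach the Ollama API
OLLAMA_API_URL=http://localhost:11434

# Illustrative settings: which models to query by default and a
# persona / system prompt per model (names here are assumptions)
DEFAULT_MODELS=gemma3:1b,llama3.2:1b,deepseek-r1:1.5b
GEMMA_SYSTEM_PROMPT=You are a creative advisor who thinks outside the box.
LLAMA_SYSTEM_PROMPT=You are a pragmatic advisor focused on concrete steps.
DEEPSEEK_SYSTEM_PROMPT=You are a skeptical advisor who stress-tests ideas.
```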
Connect to Claude for Desktop
- Locate your Claude for Desktop configuration file:
  - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  - Windows: %APPDATA%\Claude\claude_desktop_config.json
- Edit the file to add the Multi-Model Advisor MCP server (see the example configuration below)
- Replace /absolute/path/to/ with the actual path to your project directory
- Restart Claude for Desktop
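A typical entry looks like the sketch below; the server key name and the build output path (build/index.js) are assumptions based on a standard Node.js build, so adjust them to match your project:

```json
{
  "mcpServers": {
    "multi-model-advisor": {
      "command": "node",
      "args": ["/absolute/path/to/multi-ai-advisor-mcp/build/index.js"]
    }
  }
}
```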
Usage
Once connected to Claude for Desktop, you can use the Multi-Model Advisor in several ways:
List Available Models
You can see all available models on your system:
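For example, you might ask Claude (the exact wording is flexible):

```
Show me all the Ollama models available on my system
```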
This will display all installed Ollama models and indicate which ones are configured as defaults.
Basic Usage
Simply ask Claude to use the multi-model advisor:
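For example (an illustrative prompt; the model names are examples, and naming specific smaller models is what the RAM advice in Troubleshooting refers to):

```
Please use the multi-model advisor to answer: what are the best approaches to
learning a new programming language? You can use gemma3:1b and llama3.2:1b.
```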
Claude will query all default models and provide a synthesized response based on their different perspectives.
How It Works
- The MCP server exposes two tools:
  - list-available-models: shows all Ollama models on your system
  - query-models: queries multiple models with a question
- When you ask Claude a question referring to the multi-model advisor:
  - Claude decides to use the query-models tool
  - The server sends your question to multiple Ollama models
  - Each model responds with its perspective
  - Claude receives all responses and synthesizes a comprehensive answer
- Each model can have a different "persona" or role assigned, encouraging diverse perspectives.
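To make the flow concrete, here is a minimal sketch (not the project's actual source) of how a query-models tool could be implemented with the MCP TypeScript SDK and Ollama's /api/chat endpoint; the environment variable names and default models are assumptions:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Assumed configuration; the real project reads its own .env settings.
const OLLAMA_API_URL = process.env.OLLAMA_API_URL ?? "http://localhost:11434";
const DEFAULT_MODELS = (process.env.DEFAULT_MODELS ?? "gemma3:1b,llama3.2:1b").split(",");

const server = new McpServer({ name: "multi-model-advisor", version: "1.0.0" });

server.tool(
  "query-models",
  "Query multiple Ollama models with one question",
  { question: z.string() },
  async ({ question }) => {
    // Ask every configured model in parallel and collect the answers.
    const answers = await Promise.all(
      DEFAULT_MODELS.map(async (model) => {
        const res = await fetch(`${OLLAMA_API_URL}/api/chat`, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            model,
            stream: false,
            messages: [{ role: "user", content: question }],
          }),
        });
        const data = await res.json();
        return `## ${model}\n${data.message.content}`;
      })
    );
    // Return all perspectives as one text block for Claude to synthesize.
    return { content: [{ type: "text", text: answers.join("\n\n") }] };
  }
);

await server.connect(new StdioServerTransport());
```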
Troubleshooting
Ollama Connection Issues
If the server can't connect to Ollama:
- Ensure Ollama is running (ollama serve)
- Check that the OLLAMA_API_URL is correct in your .env file
- Try accessing http://localhost:11434 in your browser to verify Ollama is responding
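You can also check from a terminal; Ollama responds on its root URL and lists locally available models at /api/tags:

```bash
curl http://localhost:11434          # should reply "Ollama is running"
curl http://localhost:11434/api/tags # lists the models Ollama can serve
```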
Model Not Found
If a model is reported as unavailable:
- Check that you've pulled the model using ollama pull <model-name>
- Verify the exact model name using ollama list
- Use the list-available-models tool to see all available models
Claude Not Showing MCP Tools
If the tools don't appear in Claude:
- Ensure you've restarted Claude after updating the configuration
- Check that the absolute path in claude_desktop_config.json is correct
- Look at Claude's logs for error messages
Not Enough RAM
The advisor models you have chosen may be too large for your available memory to run. You can try specifying smaller models (see Basic Usage above) or upgrading your hardware memory.
License
MIT License
For more details, please see the LICENSE file in this project's repository.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.