Consult LLM MCP
An MCP server that lets Claude Code consult stronger AI models (o3, Gemini 2.5 Pro, DeepSeek Reasoner) when you need deeper analysis on complex problems.
Features
- Query powerful AI models (o3, Gemini 2.5 Pro, DeepSeek Reasoner) with relevant files as context
- Direct queries with optional file context
- Include git changes for code review and analysis
- Comprehensive logging with cost estimation
Configuration
- `OPENAI_API_KEY` - Your OpenAI API key (required for o3)
- `GEMINI_API_KEY` - Your Google AI API key (required for Gemini models)
- `DEEPSEEK_API_KEY` - Your DeepSeek API key (required for DeepSeek models)
- `CONSULT_LLM_DEFAULT_MODEL` - Override the default model (optional)
  - Options: `o3` (default), `gemini-2.5-pro`, `deepseek-reasoner`
Usage with Claude Code
Installation
Add the MCP server to Claude Code:
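Assuming the server is published to npm as `consult-llm-mcp` (package name is an assumption), registration might look like:

```shell
# Register the server for the current project
claude mcp add consult-llm -- npx -y consult-llm-mcp
```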
Or for global availability:
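For example, using the user scope so the server is available in every project (same package-name assumption as above):

```shell
# -s user makes the server available across all projects
claude mcp add -s user consult-llm -- npx -y consult-llm-mcp
```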
Optionally you can provide environment variables directly in the MCP configuration:
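A sketch of a `.mcp.json` entry with inline environment variables (the package name and key value are placeholders):

```json
{
  "mcpServers": {
    "consult-llm": {
      "command": "npx",
      "args": ["-y", "consult-llm-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key"
      }
    }
  }
}
```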
Example workflows
MCP Tool: consult_llm
The server provides a single tool called `consult_llm` for asking powerful AI models complex questions.
Parameters
- `prompt` (required): Your question or request for the consultant LLM
- `files` (optional): Array of file paths to include as context
  - All files are added as context with file paths and code blocks
- `model` (optional): LLM model to use
  - Options: `o3` (default), `gemini-2.5-pro`, `deepseek-reasoner`
- `git_diff` (optional): Include git diff output as context
  - `files` (required): Specific files to include in the diff
  - `repo_path` (optional): Path to git repository (defaults to current directory)
  - `base_ref` (optional): Git reference to compare against (defaults to HEAD)
Example Usage
Basic prompt:
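A minimal sketch of the tool's arguments (the prompt text is illustrative):

```json
{
  "prompt": "How should I handle schema migrations with zero downtime?"
}
```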
With file context:
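An illustrative payload; the file paths are placeholders:

```json
{
  "prompt": "Review this module for race conditions",
  "files": ["src/worker.ts", "src/queue.ts"]
}
```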
With git diff:
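An illustrative payload; the file path and `base_ref` are placeholders:

```json
{
  "prompt": "Review these changes for potential regressions",
  "git_diff": {
    "files": ["src/worker.ts"],
    "base_ref": "main"
  }
}
```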
Supported Models
- o3: OpenAI's reasoning model ($2/$8 per million tokens)
- gemini-2.5-pro: Google's Gemini 2.5 Pro ($1.25/$10 per million tokens)
- deepseek-reasoner: DeepSeek's reasoning model ($0.55/$2.19 per million tokens)
Logging
All prompts and responses are logged to `~/.consult-llm-mcp/logs/mcp.log` with:
- Tool call parameters
- Full prompts and responses
- Token usage and cost estimates
CLAUDE.md example
While not strictly necessary, you can optionally add something like the following to your project's `CLAUDE.md` file to help Claude Code understand when and how to use this tool:
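One possible snippet (the wording is illustrative, not taken from the project):

```markdown
## Consulting a stronger model

For complex problems (tricky bugs, architecture decisions, non-trivial
code review), use the consult_llm MCP tool to get a second opinion from
a stronger model. Include the relevant source files as context, and
include a git diff when asking about recent changes.
```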
That said, Claude Code generally seems to know when to use this MCP server even without this instruction.