Zen MCP Server NPX Wrapper
Easy-to-use NPX wrapper for Zen MCP Server - gives Claude access to multiple AI models (Gemini, OpenAI, OpenRouter, Ollama) for enhanced development capabilities.
Quick Start
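A minimal sketch of the quick start, assuming the wrapper is published under the package name zen-mcp-server (a placeholder - substitute the actual npm package name). Running it once via npx triggers the first-time setup described below:

```bash
# Download and run the wrapper; first run performs setup automatically
npx zen-mcp-server
```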
That's it! No Docker required. 🎉
What is Zen MCP Server?
Zen MCP Server gives Claude Desktop access to multiple AI models for:
- 🧠 Extended reasoning with Gemini 2.0 Pro's thinking mode
- 💬 Collaborative development with multiple AI perspectives
- 🔍 Code review and architectural analysis
- 🐛 Advanced debugging with specialized models
- 📊 Large context analysis (Gemini: 1M tokens, O3: 200K tokens)
- 🔄 Conversation threading - AI models maintain context across multiple calls
Features
- ✅ No Docker required - Runs directly with Python
- 🚀 Fast startup - No container overhead
- 💾 Lightweight - Minimal resource usage
- 🔧 Auto-setup - Handles Python dependencies automatically
- 📦 Virtual environment - Isolated dependencies
- 🌍 Cross-platform - Works on macOS, Windows, Linux
First Time Setup
On first run, the wrapper will:
- Check Python 3.11+ is installed
- Clone Zen MCP Server to ~/.zen-mcp-server
- Create a .env file and prompt for API keys
- Set up a Python virtual environment
- Install dependencies automatically
Configuration
1. Get API Keys (at least one required)
Choose one or more:
- Gemini: Google AI Studio
- OpenAI: OpenAI Platform
- OpenRouter: OpenRouter (access to 100+ models)
- Local Models: Ollama, vLLM, LM Studio (no API key needed)
2. Configure API Keys
Edit ~/.zen-mcp-server/.env:
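A sketch of what the file might contain. The variable names below (GEMINI_API_KEY, OPENAI_API_KEY, OPENROUTER_API_KEY, CUSTOM_API_URL) are assumptions based on common Zen MCP Server conventions - check them against the template the wrapper generates on first run:

```bash
# ~/.zen-mcp-server/.env - set at least one of these
GEMINI_API_KEY=your-gemini-key          # Google AI Studio
OPENAI_API_KEY=your-openai-key          # OpenAI Platform
OPENROUTER_API_KEY=your-openrouter-key  # OpenRouter (100+ models)

# Optional: local models (Ollama, vLLM, LM Studio) via an OpenAI-compatible endpoint
CUSTOM_API_URL=http://localhost:11434/v1  # example: Ollama's default local endpoint
```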
Usage with Claude Desktop
Add to your claude_desktop_config.json:
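A sketch of the entry, using the same placeholder package name as in the Quick Start:

```json
{
  "mcpServers": {
    "zen": {
      "command": "npx",
      "args": ["-y", "zen-mcp-server"]
    }
  }
}
```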
Location of config file:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json
Usage with Claude CLI
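With the Claude Code CLI, the server can be registered via claude mcp add; the package name is again a placeholder:

```bash
claude mcp add zen -- npx -y zen-mcp-server
```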
Available Tools
Once configured, Claude will have access to these tools:
- zen - Default tool for quick AI consultation (alias for chat)
- chat - Collaborative development discussions
- thinkdeep - Extended reasoning (Gemini 2.0 Pro)
- codereview - Professional code review
- precommit - Pre-commit validation
- debug - Advanced debugging assistance
- analyze - Smart file and codebase analysis
Quick Usage: Just say "use zen" for quick AI consultations!
Troubleshooting
Python not found?
- macOS: brew install python@3.11
- Windows: Download from python.org
- Linux: sudo apt install python3.11
Dependencies issue?
The wrapper tries to install automatically, but if it fails:
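you can install them manually inside the wrapper's virtual environment. The venv directory name below is an assumption based on the setup steps above - adjust it to whatever the wrapper actually creates:

```bash
cd ~/.zen-mcp-server
./venv/bin/pip install -r requirements.txt
```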
API key issues?
- Check ~/.zen-mcp-server/.env has valid keys
- Ensure at least one API key is configured
- For OpenRouter, check your credits/limits
Requirements
- Python 3.11+
- Node.js >= 14.0.0
- Git
- At least one API key (Gemini, OpenAI, or OpenRouter)
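To verify the prerequisites are in place before running the wrapper:

```bash
python3 --version  # should report 3.11 or newer
node --version     # should report v14.0.0 or newer
git --version
```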
Why No Docker?
We removed Docker because:
- Faster startup - No container overhead
- Less resource usage - No Redis, no Docker daemon
- Simpler - Just Python and your API keys
- Same features - Conversation threading works perfectly with in-memory storage
Links
License
Apache 2.0 - See LICENSE