GenZ MCP Server: Streamlined AI Development Assistant
A streamlined Model Context Protocol (MCP) server providing essential AI-powered development tools for your favorite coding agent. GenZ MCP Server focuses on the core functionality you need most: intelligent chat and comprehensive debugging assistance.
What is GenZ MCP Server?
GenZ MCP Server is a slimmed-down version of the comprehensive Zen MCP Server, containing only the two most essential tools:
GenZ Chat (genz_chat) - Interactive development chat and collaborative thinking
GenZ Debug (genz_debug) - Systematic root cause analysis and debugging assistance
This focused approach provides:
✅ Faster startup and reduced complexity
✅ Essential AI-powered development assistance
✅ Full multi-model provider support
✅ Conversation threading and context preservation
✅ Clean, maintainable codebase
Attribution
GenZ MCP Server is a derivative work of the excellent Zen MCP Server by BeehiveInnovations, with much love and gratitude for their original concepts and innovative ideas. The foundational architecture, multi-provider support, and conversation threading were all pioneered in the original Zen MCP Server.
This fork was created to explore a more focused approach, removing tools that weren't being used in my personal workflow and concentrating on the essential chat and debugging functionality. The original Zen MCP Server remains the comprehensive solution for teams needing the full suite of AI-powered development tools.
Tools Overview
1. GenZ Chat (genz_chat)
Interactive development chat and collaborative thinking
Perfect for:
Bouncing ideas during analysis
Getting second opinions on technical decisions
Collaborative brainstorming sessions
Validating approaches and checklists
General development questions and explanations
Exploring alternatives and solutions
Features:
File context support for code discussions
Image support for UI/visual discussions
Conversation continuation across sessions
Web search integration for current information
Multiple AI model support for diverse perspectives
2. GenZ Debug (genz_debug)
Systematic root cause analysis and debugging assistance
Perfect for:
Complex bug investigation
Mysterious errors and failures
Performance issue analysis
Race conditions and timing problems
Memory leaks and resource issues
Integration and configuration problems
Features:
Step-by-step investigation workflow
Evidence-based hypothesis tracking
Confidence levels (exploring → certain)
Systematic file examination
Context-aware analysis
Expert model validation
Backtracking support for complex investigations
Quick Start
Prerequisites
Python 3.9+
An API key for at least one supported provider:
Gemini API key (GEMINI_API_KEY)
OpenAI API key (OPENAI_API_KEY)
OpenRouter API key (OPENROUTER_API_KEY)
Local Ollama setup (CUSTOM_API_URL)
Or other supported providers
Installation
Clone the repository:
    git clone https://github.com/your-repo/genz-mcp-server
    cd genz-mcp-server

Set up your environment:

    # Create and activate virtual environment
    python -m venv .genz_venv
    source .genz_venv/bin/activate   # On Windows: .genz_venv\Scripts\activate

    # Install dependencies
    pip install -r requirements.txt

Configure API keys:

    # Copy example and edit
    cp .env.example .env
    # Edit .env with your API keys

Add to Claude Desktop config:

    {
      "mcpServers": {
        "genz-mcp-server": {
          "command": "python",
          "args": ["/path/to/genz-mcp-server/server.py"],
          "env": {
            "GEMINI_API_KEY": "your_key_here",
            "OPENAI_API_KEY": "your_key_here"
          }
        }
      }
    }
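A quick way to sanity-check the config entry before restarting Claude Desktop is to parse it and verify the fields the client reads. This is a generic JSON check, not part of the project; the path and key values are placeholders from the snippet above.

```python
import json

# The Claude Desktop config entry from above (path and keys are placeholders)
config_text = '''
{
  "mcpServers": {
    "genz-mcp-server": {
      "command": "python",
      "args": ["/path/to/genz-mcp-server/server.py"],
      "env": {
        "GEMINI_API_KEY": "your_key_here",
        "OPENAI_API_KEY": "your_key_here"
      }
    }
  }
}
'''

config = json.loads(config_text)  # raises json.JSONDecodeError on malformed JSON
server = config["mcpServers"]["genz-mcp-server"]

# Minimal shape checks: the fields Claude Desktop actually uses to launch the server
assert server["command"] == "python"
assert server["args"][0].endswith("server.py")
assert "GEMINI_API_KEY" in server["env"]
print("config OK")
```

A stray comma or an unquoted key is the most common reason the server silently fails to appear in the client, so a parse check catches most mistakes up front.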
Usage Examples
GenZ Chat Examples
Basic Development Discussion:
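For instance, a prompt along these lines (illustrative wording, not from the project docs):

```
Use genz_chat to weigh Redis vs. Memcached for session caching in our Flask app
```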
Code Review Discussion:
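A review-style prompt might look like this (file name hypothetical):

```
Use genz_chat to review the error handling in utils/retry.py and suggest improvements
```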
GenZ Debug Examples
Systematic Bug Investigation:
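An investigation prompt could read (scenario illustrative):

```
Use genz_debug to investigate why background jobs silently fail after about an hour of uptime
```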
Performance Issue Analysis:
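For example (endpoint and numbers illustrative):

```
Use genz_debug to work out why /search latency jumped from 200 ms to 3 s under load
```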
Model Support
GenZ MCP Server supports all the same AI providers as the full Zen MCP Server:
Gemini (Google AI)
OpenAI (GPT models)
Grok (X.AI)
OpenRouter (Multiple models)
DIAL (Enterprise)
Custom APIs (Ollama, vLLM, etc.)
Model Selection
Set DEFAULT_MODEL=auto for automatic model selection
Or specify a model explicitly: DEFAULT_MODEL=gemini-2.0-flash-exp
Override per request: model: gpt-4o-mini in tool calls
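As a sketch, the corresponding `.env` entries might look like this (values illustrative):

```
# Let the server pick the best model per request
DEFAULT_MODEL=auto

# ...or pin a specific model instead
# DEFAULT_MODEL=gemini-2.0-flash-exp
```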
Development
Running Tests
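The exact commands are project-specific; for a Python project laid out like this, a typical invocation (assuming pytest is used) would be:

```
# Activate the virtual environment first
source .genz_venv/bin/activate

# Run the full test suite
python -m pytest
```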
Project Structure
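Only a few files are named elsewhere in this README; a partial layout based on those (everything else is an assumption) would be:

```
genz-mcp-server/
├── server.py                # MCP server entry point
├── requirements.txt         # Python dependencies
├── .env.example             # API key template
└── code_quality_checks.sh   # Lint/format/test wrapper
```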
Migration from Zen MCP Server
If you're coming from the full Zen MCP Server, GenZ provides the essential tools you use most:
Removed tools: analyze, codereview, consensus, planner, refactor, testgen, thinkdeep, tracer, precommit, secaudit, docgen, challenge, listmodels, version
Kept tools: chat → genz_chat, debug → genz_debug
Your existing workflows using chat and debug will continue to work with the new tool names.
Contributing
Fork the repository
Create a feature branch
Make changes with tests
Run quality checks:
    ./code_quality_checks.sh

Submit a pull request
License
MIT License - see LICENSE file for details.
Support
For issues, questions, or feature requests:
Create an issue in the repository
Check existing documentation
Review the original Zen MCP Server docs for advanced concepts
GenZ MCP Server: Essential AI tools, maximum focus. 🚀