Yellhorn MCP
A Model Context Protocol (MCP) server that exposes Gemini 2.5 Pro and OpenAI capabilities to Claude Code for software development tasks, using your entire codebase in the prompt. This pattern is highly useful for defining work to be done by code assistants like Claude Code or other MCP-compatible coding agents, and for reviewing the results to ensure they meet the originally specified requirements.
Features
- Create Workplans: Creates detailed implementation plans based on a prompt and taking into consideration your entire codebase, posting them as GitHub issues and exposing them as MCP resources for your coding agent
- Judge Code Diffs: Provides a tool to evaluate git diffs against the original workplan with full codebase context, giving detailed feedback that flags deviations from the original requirements and guidance on how to correct them
- Seamless GitHub Integration: Automatically creates labeled issues and posts judgement sub-issues that reference the original workplan issues
- Context Control: Use `.yellhornignore` files to exclude specific files and directories from the AI context, similar to `.gitignore` (see the example after this list)
- MCP Resources: Exposes workplans as standard MCP resources for easy listing and retrieval
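Since `.yellhornignore` follows `.gitignore`-style pattern syntax, a minimal example (the patterns below are illustrative) might look like:

```
# Keep large or generated content out of the AI context
node_modules/
dist/
*.min.js
*.lock
```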
Installation
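The package can be installed from PyPI (the `yellhorn-mcp` package name is assumed here from the project's module name):

```bash
pip install yellhorn-mcp
```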
Configuration
The server requires the following environment variables:
- `GEMINI_API_KEY`: Your Gemini API key (required for Gemini models)
- `OPENAI_API_KEY`: Your OpenAI API key (required for OpenAI models)
- `REPO_PATH`: Path to your repository (defaults to the current directory)
- `YELLHORN_MCP_MODEL`: Model to use (defaults to `gemini-2.5-pro-preview-03-25`). Available options:
  - Gemini models: `gemini-2.5-pro-preview-03-25`, `gemini-2.5-flash-preview-04-17`
  - OpenAI models: `gpt-4o`, `gpt-4o-mini`, `o4-mini`, `o3`
The server also requires the GitHub CLI (`gh`) to be installed and authenticated.
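For example, a Gemini-backed setup might be configured like this (the values are placeholders):

```bash
export GEMINI_API_KEY="your-gemini-api-key"
export REPO_PATH="/path/to/your/repo"                      # optional; defaults to the current directory
export YELLHORN_MCP_MODEL="gemini-2.5-flash-preview-04-17" # optional; defaults to gemini-2.5-pro-preview-03-25
```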
Usage
Getting Started
VSCode/Cursor Setup
To configure Yellhorn MCP in VSCode or Cursor, create a `.vscode/mcp.json` file at the root of your workspace. A minimal sketch, assuming the common `mcpServers` configuration schema and a `yellhorn-mcp` command on your PATH:
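```json
{
  "mcpServers": {
    "yellhorn-mcp": {
      "command": "yellhorn-mcp",
      "args": [],
      "env": {
        "GEMINI_API_KEY": "your-gemini-api-key"
      }
    }
  }
}
```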
Claude Code Setup
To configure Yellhorn MCP with Claude Code directly, add a root-level `.mcp.json` file in your project. A minimal stdio-server sketch, with the same assumptions as above:
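```json
{
  "mcpServers": {
    "yellhorn-mcp": {
      "type": "stdio",
      "command": "yellhorn-mcp",
      "args": [],
      "env": {}
    }
  }
}
```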
Tools
create_workplan
Creates a GitHub issue with a detailed workplan based on the title and detailed description.
Input:
- `title`: Title for the GitHub issue (used as the issue title and header)
- `detailed_description`: Detailed description for the workplan
- `codebase_reasoning`: (optional) Controls whether AI enhancement is performed:
  - `"full"`: (default) Use AI to enhance the workplan with full codebase context
  - `"lsp"`: Use AI with lightweight codebase context (function/method signatures, class attributes, and struct fields for Python and Go)
  - `"none"`: Skip AI enhancement and use the provided description as-is
- `debug`: (optional) If set to `true`, adds a comment to the issue with the full prompt used for generation
Output:
- JSON string containing:
  - `issue_url`: URL to the created GitHub issue
  - `issue_number`: The GitHub issue number
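For instance, a successful call might return a JSON string such as (values are illustrative):

```json
{
  "issue_url": "https://github.com/your-org/your-repo/issues/42",
  "issue_number": 42
}
```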
get_workplan
Retrieves the workplan content (the GitHub issue body) for a given workplan issue.
Input:
- `issue_number`: The GitHub issue number for the workplan.
Output:
- The content of the workplan issue as a string
judge_workplan
Triggers an asynchronous code judgement that compares two git refs (branches or commits) against the workplan described in a GitHub issue. The judgement runs in the background and is posted as a GitHub sub-issue when it completes.
Input:
- `issue_number`: The GitHub issue number for the workplan.
- `base_ref`: Base Git ref (commit SHA, branch name, tag) for comparison. Defaults to `main`.
- `head_ref`: Head Git ref (commit SHA, branch name, tag) for comparison. Defaults to `HEAD`.
- `codebase_reasoning`: (optional) Controls which codebase context is provided:
  - `"full"`: (default) Use full codebase context
  - `"lsp"`: Use lighter codebase context (only function signatures for Python and Go, plus the full contents of files in the diff)
  - `"none"`: Skip codebase context entirely for the fastest processing
- `debug`: (optional) If set to `true`, adds a comment to the sub-issue with the full prompt used for generation
Output:
- A confirmation message that the judgement task has been initiated
Resource Access
Yellhorn MCP also implements the standard MCP resource API to provide access to workplans:
- `list-resources`: Lists all workplans (GitHub issues with the `yellhorn-mcp` label)
- `get-resource`: Retrieves the content of a specific workplan by issue number
These can be accessed via the standard MCP CLI commands.
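As an illustration, the same resources can also be read programmatically with the MCP Python SDK. This is a sketch, not the project's own tooling; the `yellhorn-mcp` launch command is an assumption:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server over stdio; assumes a `yellhorn-mcp` entry point on PATH.
server_params = StdioServerParameters(command="yellhorn-mcp")

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List all workplan resources exposed by the server.
            listing = await session.list_resources()
            for resource in listing.resources:
                print(resource.uri, resource.name)

            # Fetch the content of the first workplan, if any.
            if listing.resources:
                content = await session.read_resource(listing.resources[0].uri)
                print(content)

asyncio.run(main())
```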
Development
CI/CD
The project uses GitHub Actions for continuous integration and deployment:
- Testing: Runs automatically on pull requests and pushes to the main branch
  - Linting with flake8
  - Format checking with black
  - Testing with pytest
- Publishing: Automatically publishes to PyPI when a version tag is pushed
  - Tag must match the version in `pyproject.toml` (e.g., `v0.2.2`)
  - Requires a PyPI API token stored as a GitHub repository secret (`PYPI_API_TOKEN`)
To release a new version:
- Update the version in `pyproject.toml` and `yellhorn_mcp/__init__.py`
- Update `CHANGELOG.md` with the new changes
- Commit the changes: `git commit -am "Bump version to X.Y.Z"`
- Tag the commit: `git tag vX.Y.Z`
- Push the changes and tag: `git push && git push --tags`
For a history of changes, see the Changelog.
For more detailed instructions, see the Usage Guide.
License
MIT