## Server Configuration
Environment variables used to configure the server.
| Name | Required | Description | Default |
|---|---|---|---|
| REPO_PATH | No | Path to your repository | current directory |
| GEMINI_API_KEY | Yes | Your Gemini API key | |
| YELLHORN_MCP_MODEL | No | Gemini model to use | gemini-2.5-pro-exp-03-25 |
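These variables are typically supplied by the MCP client that launches the server. The sketch below uses the official MCP Python SDK to start the server over stdio with the environment set; the launch command (`yellhorn-mcp`) and the client-side wiring are illustrative assumptions, not taken from this document.

```python
# Minimal sketch: launching the server over stdio with its environment set.
# Assumes the official MCP Python SDK ("mcp" package) and a `yellhorn-mcp`
# console command -- adjust the command to match your installation.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="yellhorn-mcp",  # hypothetical launch command
    args=[],
    env={
        "GEMINI_API_KEY": os.environ["GEMINI_API_KEY"],    # required
        "REPO_PATH": "/path/to/your/repo",                 # optional; defaults to the current directory
        "YELLHORN_MCP_MODEL": "gemini-2.5-pro-exp-03-25",  # optional; this is the default
    },
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())
```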
## Tools
Functions exposed to the LLM to take actions. A client-side usage sketch follows the table.
| Name | Description |
|---|---|
| create_workplan | Creates a GitHub issue with a detailed implementation plan. The AI analyzes your entire codebase (respecting .gitignore) to produce the plan and supports multiple codebase reasoning modes. Returns the created issue URL and number immediately. |
| get_workplan | Retrieves the workplan content (GitHub issue body) for a specified issue number. |
| revise_workplan | Updates an existing workplan based on revision instructions. The AI uses the same codebase analysis mode and model as the original workplan. Returns the issue URL and number immediately. |
| curate_context | Analyzes the codebase and creates a .yellhorncontext file listing the directories to include in the AI context. The file acts as a whitelist: only files matching its patterns are included, which significantly reduces token usage and keeps the AI focused on relevant code. Example .yellhorncontext entries: `src/api/`, `src/models/`, `tests/api/`, `*.config.js` |
| judge_workplan | Triggers an asynchronous code judgement comparing two git refs against a workplan, evaluating how well the changes between the refs implement the plan. Returns the sub-issue URL immediately. |
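For orientation, here is a sketch of how an MCP client might call these tools once connected, continuing the `session` from the configuration example above. The argument names (`title`, `detailed_description`, `issue_number`, `base_ref`, `head_ref`) are illustrative assumptions; inspect the schemas returned by `list_tools` for the actual parameters.

```python
# Hypothetical tool calls against an initialized ClientSession `session`.
# Argument names below are assumptions for illustration -- check the tool
# schemas from `session.list_tools()` for the real parameter names.

async def demo_workflow(session) -> None:
    # Create a workplan; the issue URL and number are returned immediately.
    created = await session.call_tool(
        "create_workplan",
        arguments={
            "title": "Add rate limiting to the public API",  # assumed parameter
            "detailed_description": "Limit unauthenticated clients to 60 req/min.",  # assumed parameter
        },
    )
    print(created.content)

    # Retrieve the workplan body for a known issue number.
    workplan = await session.call_tool(
        "get_workplan",
        arguments={"issue_number": "123"},  # assumed parameter
    )
    print(workplan.content)

    # Ask for an asynchronous judgement of a branch against the workplan.
    judgement = await session.call_tool(
        "judge_workplan",
        arguments={
            "issue_number": "123",              # assumed parameter
            "base_ref": "main",                 # assumed parameter
            "head_ref": "feature/rate-limit",   # assumed parameter
        },
    )
    print(judgement.content)  # the sub-issue URL is returned immediately
```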
## Prompts
Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
## Resources
Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |