
Server Configuration

Describes the environment variables required to run the server.

Name               | Required | Description             | Default
-------------------|----------|-------------------------|-------------------------
REPO_PATH          | No       | Path to your repository | current directory
GEMINI_API_KEY     | Yes      | Your Gemini API key     | (none)
YELLHORN_MCP_MODEL | No       | Gemini model to use     | gemini-2.5-pro-exp-03-25
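A minimal sketch of how these variables and their defaults might be read at startup. The `load_config` helper is hypothetical, introduced here only to illustrate the table; the actual server may read its environment differently.

```python
import os

def load_config(env):
    # GEMINI_API_KEY is the only required variable.
    if "GEMINI_API_KEY" not in env:
        raise RuntimeError("GEMINI_API_KEY is required")
    return {
        "repo_path": env.get("REPO_PATH", "."),  # default: current directory
        "api_key": env["GEMINI_API_KEY"],
        "model": env.get("YELLHORN_MCP_MODEL", "gemini-2.5-pro-exp-03-25"),
    }

# Example: only the required key is set, so the defaults apply.
cfg = load_config({"GEMINI_API_KEY": "example-key"})
```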

Tools

Functions exposed to the LLM to take actions

create_workplan

Creates a GitHub issue with a detailed implementation plan.

This tool will:

  1. Create a GitHub issue immediately with the provided title and description

  2. Launch a background AI process to generate a comprehensive workplan

  3. Update the issue with the generated workplan once complete

The AI will analyze your entire codebase (respecting .gitignore) to create a detailed plan with:

  • Specific files to modify/create

  • Code snippets and examples

  • Step-by-step implementation instructions

  • Testing strategies

Codebase reasoning modes:

  • "full": Complete file contents (most comprehensive)

  • "lsp": Function signatures and docstrings only (lighter weight)

  • "file_structure": Directory tree only (fastest)

  • "none": No codebase context

Returns the created issue URL and number immediately.
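The create-immediately / fill-in-later flow described above can be sketched with asyncio. This is illustrative only: the in-memory `issues` dict and the `generate_workplan` stub stand in for the real GitHub and Gemini calls.

```python
import asyncio

async def generate_workplan(title: str) -> str:
    # Stand-in for the background AI call that analyzes the codebase.
    await asyncio.sleep(0)
    return f"Workplan for: {title}"

async def create_workplan(title: str, issues: dict) -> int:
    # 1. Create the issue immediately with a placeholder body.
    issue_number = len(issues) + 1
    issues[issue_number] = "Generating workplan..."

    # 2./3. Launch a background task that updates the issue once complete.
    async def finish():
        issues[issue_number] = await generate_workplan(title)
    asyncio.create_task(finish())

    # Return right away, before generation completes.
    return issue_number

async def main():
    issues = {}
    number = await create_workplan("Add caching layer", issues)
    print(issues[number])          # still the placeholder
    await asyncio.sleep(0.01)      # let the background task run
    print(issues[number])          # now the generated workplan

asyncio.run(main())
```

The same pattern applies to revise_workplan and judge_workplan below, which also return an issue reference immediately and update it in the background.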

get_workplan

Retrieves the workplan content (GitHub issue body) for a specified issue number.

revise_workplan

Updates an existing workplan based on revision instructions.

This tool will:

  1. Fetch the existing workplan from the specified GitHub issue

  2. Launch a background AI process to revise the workplan based on your instructions

  3. Update the issue with the revised workplan once complete

The AI will use the same codebase analysis mode and model as the original workplan.

Returns the issue URL and number immediately.

curate_context

Analyzes the codebase and creates a .yellhorncontext file listing directories to be included in AI context.

This tool helps optimize AI context by:

  1. Analyzing your codebase structure

  2. Understanding the task you want to accomplish

  3. Creating a .yellhorncontext file that lists relevant directories

  4. Restricting subsequent workplan/judgement calls to files from these directories

The .yellhorncontext file acts as a whitelist: only files matching its patterns are included in context. This significantly reduces token usage and keeps the AI focused on relevant code.

Example .yellhorncontext:

  src/api/
  src/models/
  tests/api/
  *.config.js
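How such a whitelist might be applied is sketched below, assuming directory entries (ending in `/`) match by path prefix and bare entries match as globs. The `included` helper and these matching rules are assumptions for illustration; the server's actual matching may differ.

```python
from fnmatch import fnmatch

PATTERNS = ["src/api/", "src/models/", "tests/api/", "*.config.js"]

def included(path: str, patterns=PATTERNS) -> bool:
    for pat in patterns:
        if pat.endswith("/"):
            # Directory entry: include anything under that directory.
            if path.startswith(pat):
                return True
        elif fnmatch(path, pat):
            # Glob entry: match against the full path.
            return True
    return False

print(included("src/api/users.py"))   # inside a whitelisted directory
print(included("docs/readme.md"))     # not covered by any pattern
```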

judge_workplan

Triggers an asynchronous code judgement comparing two git refs against a workplan.

This tool will:

  1. Create a sub-issue linked to the workplan immediately

  2. Launch a background AI process to analyze the code changes

  3. Update the sub-issue with the judgement once complete

The judgement will evaluate:

  • Whether the implementation follows the workplan

  • Code quality and completeness

  • Missing or incomplete items

  • Suggestions for improvement

Supports comparing:

  • Branches (e.g., feature-branch vs main)

  • Commits (e.g., abc123 vs def456)

  • PR changes (automatically uses PR's base and head)

Returns the sub-issue URL immediately.
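The ref pairs above map onto an ordinary git diff. A sketch follows; the `diff_command` helper and the choice of a three-dot diff are assumptions for illustration, not the server's actual implementation.

```python
def diff_command(base: str, head: str) -> list:
    # Three-dot diff shows changes on `head` since it diverged from `base`,
    # which mirrors what a PR-style review compares.
    return ["git", "diff", f"{base}...{head}"]

# Works the same for branches or commit SHAs.
print(diff_command("main", "feature-branch"))
print(diff_command("abc123", "def456"))
```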

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/msnidal/yellhorn-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.