Deep Code Reasoning MCP Server

by haasonsaas
MIT License

Server Configuration

Describes the environment variables required to run the server; a connection sketch showing how to supply them follows the table.

Name             Required   Description                   Default
GEMINI_API_KEY   Yes        Your Google Gemini API key    (none)
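
The sketch below shows one way to supply GEMINI_API_KEY when connecting to the server from the TypeScript MCP SDK. The launch command and entry point (`node dist/index.js`) are assumptions, not confirmed by this listing; adjust them to however you build and run the server.

```typescript
// Hypothetical sketch: connect to the server over stdio with GEMINI_API_KEY set.
// `node dist/index.js` is an assumed entry point; substitute your actual launch command.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"], // assumed build output path
    env: {
      // Required by the server; there is no default.
      GEMINI_API_KEY: process.env.GEMINI_API_KEY ?? "",
    },
  });

  const client = new Client({ name: "example-client", version: "0.1.0" });
  await client.connect(transport);

  // List the tools described below to confirm the connection works.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));
}

main().catch(console.error);
```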


Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions. A call sketch follows the list below.

escalate_analysis

Hand off complex analysis to Gemini when Claude Code hits reasoning limits. Gemini will perform deep semantic analysis beyond syntactic patterns.

trace_execution_path

Use Gemini to perform deep execution analysis with semantic understanding

hypothesis_test

Use Gemini to test specific theories about code behavior

cross_system_impact

Use Gemini to analyze changes across service boundaries

performance_bottleneck

Use Gemini for deep performance analysis with execution modeling

start_conversation

Start a conversational analysis session between Claude and Gemini

continue_conversation

Continue an ongoing analysis conversation

finalize_conversation

Complete the conversation and get final analysis results

get_conversation_status

Check the status and progress of an ongoing conversation

run_hypothesis_tournament

Run a competitive hypothesis tournament to find root causes. Multiple AI conversations test different theories in parallel, with evidence-based scoring and elimination rounds.
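
Continuing from the client created in the configuration sketch above, the following sketch shows the shape of a tool call and the conversational flow (start_conversation, then continue_conversation as needed, get_conversation_status, and finalize_conversation). The argument objects are deliberately left as placeholders: this listing does not show the tools' input schemas, so inspect what listTools() returns before calling.

```typescript
// Sketch only: the argument shapes below are assumptions, not the server's real schemas.
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function runAnalysis(client: Client) {
  // Inspect the declared input schema before constructing arguments.
  const { tools } = await client.listTools();
  console.log(tools.find((t) => t.name === "escalate_analysis")?.inputSchema);

  // One-shot escalation to Gemini; fill in arguments per the schema printed above.
  const escalation = await client.callTool({
    name: "escalate_analysis",
    arguments: { /* e.g. the files and question Claude Code got stuck on */ },
  });
  console.log(escalation.content);

  // Conversational flow: start, continue as needed, check status, then finalize.
  const started = await client.callTool({ name: "start_conversation", arguments: {} });
  console.log(started); // presumably includes a session identifier to pass back below
  await client.callTool({ name: "continue_conversation", arguments: {} });
  await client.callTool({ name: "get_conversation_status", arguments: {} });
  const finalResult = await client.callTool({ name: "finalize_conversation", arguments: {} });
  console.log(finalResult.content);
}
```

run_hypothesis_tournament can be invoked the same way through callTool; only the tool name and arguments differ.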

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/haasonsaas/deep-code-reasoning-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.