
Code Review MCP Server

Server Configuration

Describes the environment variables required to run the server.

Name               | Required | Description                                                       | Default
GEMINI_MODEL       | No       | Optional: Override the default Gemini model                       | gemini-1.5-pro
LLM_PROVIDER       | No       | Which LLM provider to use. Options: OPEN_AI, ANTHROPIC, or GEMINI | OPEN_AI
OPENAI_MODEL       | No       | Optional: Override the default OpenAI model                       | gpt-4o
GEMINI_API_KEY     | No       | Your Gemini API key                                               |
OPENAI_API_KEY     | No       | Your OpenAI API key                                               |
ANTHROPIC_MODEL    | No       | Optional: Override the default Anthropic model                    | claude-3-opus-20240307
ANTHROPIC_API_KEY  | No       | Your Anthropic API key                                            |
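
As a rough illustration, here is one way these variables might be supplied when an MCP client launches the server over stdio, using the TypeScript MCP SDK. The launch command and entry-point path are placeholders for wherever you have built the server, not its published install command.

import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// The command and script path below are assumptions; point them at your own
// checkout/build of the server.
const transport = new StdioClientTransport({
  command: "node",
  args: ["./mcp-code-review-server/build/index.js"],
  env: {
    LLM_PROVIDER: "ANTHROPIC",                         // or OPEN_AI / GEMINI
    ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY ?? "",
    ANTHROPIC_MODEL: "claude-3-opus-20240307",         // optional override
  },
});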

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

analyze_repo

Use this tool when you need to analyze a code repository structure without performing a detailed review. This tool flattens the repository into a textual representation and is ideal for getting a high-level overview of code organization, directory structure, and file contents. Use it before code_review when you need to understand the codebase structure first, or when a full code review is not needed.

code_review

Use this tool when you need a comprehensive code review with specific feedback on code quality, security issues, performance problems, and maintainability concerns. This tool performs in-depth analysis on a repository or specific files and returns structured results including issues found, their severity, recommendations for fixes, and overall strengths of the codebase. Use it when you need actionable insights to improve code quality or when evaluating a codebase for potential problems.
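
As a sketch only (assuming the TypeScript MCP SDK and a locally built copy of the server), a client could invoke these tools roughly as follows; argument names such as repoPath are assumptions, so check the tool schemas the server actually advertises.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["./mcp-code-review-server/build/index.js"],   // placeholder path
  env: { LLM_PROVIDER: "OPEN_AI", OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// First get a high-level structural overview of the repository...
const overview = await client.callTool({
  name: "analyze_repo",
  arguments: { repoPath: "/path/to/repo" },   // parameter name is an assumption
});

// ...then request a full review with issues, severities, and recommendations.
const review = await client.callTool({
  name: "code_review",
  arguments: { repoPath: "/path/to/repo" },   // parameter name is an assumption
});

console.log(overview, review);
await client.close();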

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/crazyrabbitLTC/mcp-code-review-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.