Reviewer MCP

An MCP (Model Context Protocol) service that provides AI-powered development workflow tools. It supports multiple AI providers (OpenAI and Ollama) and offers standardized tools for specification generation, code review, and project management.

Features

  • Specification Generation: Create detailed technical specifications from prompts
  • Specification Review: Review specifications for completeness and provide critical feedback
  • Code Review: Analyze code changes with focus on security, performance, style, or logic
  • Test Runner: Execute tests with LLM-friendly formatted output
  • Linter: Run linters with structured output formatting
  • Pluggable AI Providers: Support for both OpenAI and Ollama (local models)

Installation

npm install
npm run build

Configuration

Environment Variables

Create a .env file based on .env.example:

# AI Provider Configuration
AI_PROVIDER=openai  # Options: openai, ollama

# OpenAI Configuration
OPENAI_API_KEY=your_api_key_here
OPENAI_MODEL=o1-preview

# Ollama Configuration (for local models)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama2

Project Configuration

Create a .reviewer.json file in your project root to customize commands:

{ "testCommand": "npm test", "lintCommand": "npm run lint", "buildCommand": "npm run build", "aiProvider": "ollama", "ollamaModel": "codellama" }

Using with Claude Desktop

Add the following to your Claude Desktop configuration:

{ "mcpServers": { "reviewer": { "command": "node", "args": ["/path/to/reviewer-mcp/dist/index.js"], "env": { "OPENAI_API_KEY": "your-api-key-here" } } } }

Using with Ollama

  1. Install Ollama: https://ollama.ai
  2. Pull a model: ollama pull llama2 or ollama pull codellama
  3. Set AI_PROVIDER=ollama in your .env file
  4. The service will use your local Ollama instance
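
For example, a minimal .env for the Ollama provider could look like this (the values mirror the configuration section above):

AI_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=codellama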

Available Tools

generate_spec

Generate a technical specification document.

Parameters:

  • prompt (required): Description of what specification to generate
  • context (optional): Additional context or requirements
  • format (optional): Output format - "markdown" or "structured"
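
For example, a call to generate_spec might pass arguments like these (the values are purely illustrative):

{
  "prompt": "A CLI tool that batch-renames files using regular expressions",
  "context": "Node.js project distributed as an npm package",
  "format": "markdown"
}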

review_spec

Review a specification for completeness and provide critical feedback.

Parameters:

  • spec (required): The specification document to review
  • focusAreas (optional): Array of specific areas to focus the review on
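
For example (the spec text is shortened for illustration):

{
  "spec": "# File Renamer CLI\n\n## Overview\nA command-line tool that batch-renames files using regular expressions...",
  "focusAreas": ["error handling", "edge cases"]
}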

review_code

Review code changes and provide feedback.

Parameters:

  • diff (required): Git diff or code changes to review
  • context (optional): Context about the changes
  • reviewType (optional): Type of review - "security", "performance", "style", "logic", or "all"
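
For example, a security-focused review request might look like this (the diff and context are illustrative):

{
  "diff": "-  fs.renameSync(oldPath, newPath);\n+  fs.renameSync(oldPath, sanitize(newPath));",
  "context": "Sanitizes user-supplied file names before renaming",
  "reviewType": "security"
}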

run_tests

Run standardized tests for the project.

Parameters:

  • testCommand (optional): Test command to run (defaults to configured command)
  • pattern (optional): Test file pattern to match
  • watch (optional): Run tests in watch mode
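
For example (the pattern value is illustrative; testCommand falls back to the configured command when omitted):

{
  "testCommand": "npm test",
  "pattern": "**/*.test.ts",
  "watch": false
}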

run_linter

Run the standardized linter for the project.

Parameters:

  • lintCommand (optional): Lint command to run (defaults to configured command)
  • fix (optional): Attempt to fix issues automatically
  • files (optional): Array of specific files to lint
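
For example (the file path is illustrative):

{
  "lintCommand": "npm run lint",
  "fix": true,
  "files": ["src/index.ts"]
}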

Development

# Run in development mode
npm run dev

# Run tests
npm test

# Run unit tests only
npm run test:unit

# Run integration tests (requires Ollama)
npm run test:integration

# Type checking
npm run typecheck

# Linting
npm run lint

End-to-End Testing

The project includes a comprehensive e2e test that validates the full workflow using a real Ollama instance:

  1. Install and start Ollama: https://ollama.ai
  2. Pull a model: ollama pull llama2
  3. Run the test: npm run test:e2e
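
Put together, a typical run looks like this (assuming Ollama is already installed and serving locally):

ollama pull llama2
npm run test:e2e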

The e2e test demonstrates:

  • Specification generation
  • Specification review
  • Code creation
  • Code review
  • Linting
  • Test execution

All of these steps use real AI responses from your local Ollama instance.

License

MIT

