The server draws on Google's AI infrastructure (Gemini and Vertex AI) and Azure OpenAI to power automated code quality assessments and requirement validation:
Gemini (Google AI, default): the default provider for analyzing code for bugs, scoring quality, and validating alignment with requirements documents.
Claude (Google Cloud Vertex AI): uses the Vertex AI platform to access high-performance models for code review workflows and security analysis.
ChatGPT (Azure OpenAI): provides ChatGPT-powered code reviews, best-practice suggestions, and unified diff generation for code improvements.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@MCP Code Reviewer review auth.py for security vulnerabilities and propose fixes".
That's it! The server will respond to your query, and you can continue using it as needed.
MCP Code Reviewer
An MCP (Model Context Protocol) server that provides AI-powered code review with human-in-the-loop confirmation. Built with FastMCP and LiteLLM for multi-provider AI support.
Features
AI-Powered Code Review: Analyze code for bugs, security vulnerabilities, best practices, and performance issues
Requirements Validation: Compare code against requirements documents to ensure alignment
Automated Fix Proposals: Generate code changes to address identified issues
Human Confirmation: CLI prompts or token-based approval workflow before applying changes
Multi-Provider Support: Use Gemini (via API - default), Claude (via Vertex AI), or ChatGPT (via Azure)
Safe File Operations: Automatic backups before modifications, file size limits, and path validation
Project Structure
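The exact layout is not reproduced here; the sketch below is reconstructed from the files this README mentions (the server.py and client.py entry-point names are assumptions):

```
mcp-code-reviewer/
├── server.py            # MCP server entry point (name assumed)
├── client.py            # interactive test client (name assumed)
├── prompts/
│   ├── code_review.txt      # code review criteria
│   ├── requirements.txt     # requirements validation approach
│   └── fix_proposal.txt     # fix generation instructions
├── .backups/            # automatic backups before modification
├── .env.example         # environment variable template
└── .env                 # your local configuration
```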
Installation
1. Clone or navigate to the project directory
2. Create virtual environment
3. Activate virtual environment
Windows:
Mac/Linux:
4. Install dependencies
5. Set up environment variables
Copy .env.example to .env and configure:
Edit .env with your credentials:
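Putting steps 1–5 together, a typical sequence looks like this (the repository URL and the requirements.txt file name are placeholders; use whatever the project actually provides):

```bash
# 1. Clone or navigate to the project directory (URL is a placeholder)
git clone https://github.com/<your-org>/mcp-code-reviewer.git
cd mcp-code-reviewer

# 2. Create a virtual environment
python -m venv venv

# 3. Activate it (Windows)
venv\Scripts\activate
# 3. Activate it (Mac/Linux)
source venv/bin/activate

# 4. Install dependencies (file name assumed)
pip install -r requirements.txt

# 5. Copy the environment template and edit it with your credentials
cp .env.example .env
```

A minimal .env might look like the sketch below. Only MAX_FILE_SIZE_MB and BACKUP_DIRECTORY are named elsewhere in this README; GEMINI_API_KEY follows LiteLLM's convention for the default Gemini provider and may differ in the actual .env.example:

```bash
# Default provider: Gemini via Google AI Studio (variable name per LiteLLM convention)
GEMINI_API_KEY=your-api-key-here

# Safety settings referenced later in this README
MAX_FILE_SIZE_MB=5
BACKUP_DIRECTORY=.backups
```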
Usage
Running the MCP Server
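This README does not name the server's entry-point script; assuming it is server.py, you would start it with:

```bash
python server.py
```

The server then waits for an MCP client (the interactive client below, or an MCP-capable chat host) to connect, typically over stdio.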
Running the Interactive Client
The project includes a simple interactive client to test the MCP server:
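Assuming the client script is named client.py (the name is not given in this README):

```bash
python client.py
```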
This will:
Connect to the MCP server automatically
Show an interactive menu with all available tools
Guide you through using each tool with prompts
Available Client Options:
List Available Tools
Read File
Review Code
Validate Against Requirements
Propose Changes
Full Review Workflow
Available Tools
1. read_file_tool
Read content from a local file.
2. review_code_tool
Perform AI-powered code quality review.
Returns:
Review summary
List of issues with severity and line numbers
Overall quality score (0-10)
3. validate_requirements_tool
Validate code against requirements document.
Returns:
Alignment score
Missing requirements
Extra functionality
Recommendations
4. propose_changes_tool
Generate proposed code changes to fix issues.
Returns:
Original content
Proposed content
Unified diff
Change summary
5. confirm_and_apply_tool
Show changes and apply after confirmation.
Confirmation Modes:
cli_prompt: Interactive CLI prompt (requires user input)
return_for_approval: Returns approval token for later application
6. apply_approved_tool
Apply previously approved changes using token.
7. full_review_workflow
Complete workflow: review → validate → propose → confirm.
Example Workflows
Quick Code Review
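An illustrative chat request, following the @-mention pattern shown earlier (the file name and phrasing are examples, not fixed syntax):

```
@MCP Code Reviewer review src/auth.py and report any bugs, security issues, and the overall quality score
```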
Review and Auto-Fix
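For example, chaining review, fix proposal, and confirmation in a single request:

```
@MCP Code Reviewer review utils.py, propose fixes for the issues you find, and apply them after showing me the diff
```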
Manual Approval Flow
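An illustrative two-step exchange using the return_for_approval confirmation mode and apply_approved_tool (the token value is returned by the server):

```
@MCP Code Reviewer propose changes to parser.py and return an approval token instead of applying them
...review the proposed diff...
@MCP Code Reviewer apply the approved changes using token <token>
```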
AI Provider Configuration
Claude (via Google Vertex AI)
Set up Google Cloud credentials and enable the Vertex AI API.
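A typical setup with the gcloud CLI (the project ID is a placeholder; consult .env.example for the exact variables the server expects):

```bash
# Authenticate with application-default credentials
# (or point GOOGLE_APPLICATION_CREDENTIALS at a service-account key)
gcloud auth application-default login

# Enable the Vertex AI API for your project
gcloud services enable aiplatform.googleapis.com --project your-project-id
```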
ChatGPT (via Azure)
Create an Azure OpenAI resource and deploy a GPT-4 model.
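Once the resource and deployment exist, the credentials usually go into .env; the variable names below follow LiteLLM's Azure convention and may differ in this project's .env.example:

```bash
AZURE_API_KEY=your-azure-openai-key
AZURE_API_BASE=https://your-resource-name.openai.azure.com/
AZURE_API_VERSION=2024-02-15-preview   # example version string
```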
Gemini (via API)
Get an API key from Google AI Studio.
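With a key from Google AI Studio, the default provider typically needs a single variable (LiteLLM's conventional name shown; the project's .env.example is authoritative):

```bash
GEMINI_API_KEY=your-google-ai-studio-key
```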
Safety Features
Automatic Backups: Files are backed up to .backups/ before modification
File Size Limits: Prevents memory issues with large files (default 5MB)
Path Validation: Prevents directory traversal attacks
Human Confirmation: Changes require explicit approval
Development
Testing Individual Tools
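The interactive client's menu (Read File, Review Code, Validate Against Requirements, Propose Changes) is the simplest way to exercise one tool at a time. Alternatively, any generic MCP client works; for example, the MCP Inspector can call tools directly against the server (server.py is an assumed entry-point name, and the Inspector is a general MCP utility rather than part of this project):

```bash
npx @modelcontextprotocol/inspector python server.py
```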
Custom Prompts
Edit files in prompts/ directory to customize AI behavior:
code_review.txt: Code review criteria
requirements.txt: Requirements validation approach
fix_proposal.txt: Fix generation instructions
Troubleshooting
LiteLLM Connection Issues
Ensure environment variables are correctly set for your chosen provider.
File Size Errors
Adjust MAX_FILE_SIZE_MB in the .env file.
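For example, to raise the limit to 10 MB:

```bash
MAX_FILE_SIZE_MB=10
```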
Backup Directory
Ensure the .backups/ directory exists or set a custom path in BACKUP_DIRECTORY.
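For example:

```bash
mkdir -p .backups
# or, in .env, point at a different location
BACKUP_DIRECTORY=/path/to/backups
```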
Resources
License
MIT