# AI Peer Review MCP Server

**Enhance your local LLM responses with real-time peer review from Google Gemini**
A Model Context Protocol (MCP) server that enables local language models to request peer review feedback from Google Gemini, dramatically improving response quality through AI collaboration.
## Features

- **Real-time peer review** from Google Gemini for any local LLM response
- **Manual trigger system** - you control when to request peer review
- **Detailed feedback analysis** - accuracy, completeness, clarity, and improvement suggestions
- **Comprehensive logging** - see exactly what feedback Gemini provides
- **Privacy-conscious** - only shares content when explicitly requested
- **Free to use** - leverages Google Gemini's free tier
- **Easy integration** - works with any MCP-compatible local LLM setup
## Use Cases

- **Fact-checking** complex or technical responses
- **Quality improvement** for educational content
- **Writing enhancement** for creative tasks
- **Technical validation** for coding explanations
- **Research assistance** with multiple AI perspectives
## Prerequisites

- **Python 3.8+** installed on your system
- **LMStudio** (or another MCP-compatible LLM client)
- **Google AI Studio account** (free) for Gemini API access
- **Local LLM with tool-calling support** (e.g., Llama 3.1, Mistral, Qwen)
## Quick Start

### 1. Get a Google Gemini API Key

1. Visit Google AI Studio
2. Sign in with your Google account
3. Click "Get API key" → "Create API key in new project"
4. Copy your API key (it starts with `AIza...`)
### 2. Install the MCP Server

```bash
# Clone the project (replace with the actual repo URL)
git clone https://github.com/your-repo/ai-peer-review-mcp
cd ai-peer-review-mcp

# Create a virtual environment (recommended)
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Create the environment file, then edit it and add your API key:
# GEMINI_API_KEY=your_actual_api_key_here
cp .env.example .env
```

### 3. Review the Server Files
`requirements.txt`:

```
requests
python-dotenv
```

`server.py`: see the full code in the repository.
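As orientation for reading `server.py`, the request and response shapes below follow Google's public `generateContent` REST API. The helper names, the prompt wording, and the model name are illustrative assumptions, not code taken from the repository.

```python
# Hypothetical model name; model names change over time, so check
# Google's current documentation before relying on this one.
GEMINI_API_URL = (
    "https://generativelanguage.googleapis.com/v1beta/"
    "models/gemini-1.5-flash:generateContent"
)

def build_review_request(question, answer):
    """Build the JSON body for a generateContent call asking Gemini to
    peer-review a local model's answer. Prompt wording is illustrative."""
    prompt = (
        "PEER REVIEW REQUEST:\n"
        f"Question: {question}\n"
        f"Answer under review: {answer}\n"
        "Assess accuracy, completeness, and clarity, and suggest improvements."
    )
    return {"contents": [{"parts": [{"text": prompt}]}]}

def extract_feedback(response_json):
    """Pull the review text out of a generateContent response."""
    return response_json["candidates"][0]["content"]["parts"][0]["text"]
```

A real call would POST this body, e.g. `requests.post(GEMINI_API_URL, params={"key": api_key}, json=body)`, and feed the extracted feedback back to the local model.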
### 4. Configure LMStudio (or another supported MCP host, e.g., Claude Desktop)

Add this configuration to your LMStudio MCP settings:
```json
{
  "mcpServers": {
    "ai-peer-review": {
      "command": "python",
      "args": ["server.py"],
      "cwd": "/path/to/your/ai-peer-review-mcp",
      "env": {
        "GEMINI_API_KEY": "your_actual_api_key_here"
      }
    }
  }
}
```

Finding the MCP settings in LMStudio:

- Settings → MCP Servers
- Tools & Integrations → MCP Configuration
- Program button → Edit MCP JSON
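A typo in this JSON is a frequent cause of the "Tool not available" error covered under Troubleshooting. The stand-alone checker below validates the shape shown above; the required keys simply mirror that snippet, and the helper is a convenience sketch, not part of the server.

```python
import json

# Keys every server entry needs, per the example configuration above.
REQUIRED_KEYS = {"command", "args", "cwd"}

def check_mcp_config(text):
    """Return a list of problems found in an LMStudio-style MCP config string."""
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError as e:
        return ["invalid JSON: %s" % e]
    problems = []
    servers = cfg.get("mcpServers", {})
    if not servers:
        problems.append("no mcpServers entry")
    for name, spec in servers.items():
        missing = REQUIRED_KEYS - spec.keys()
        if missing:
            problems.append("%s: missing %s" % (name, sorted(missing)))
        if not spec.get("env", {}).get("GEMINI_API_KEY"):
            problems.append("%s: GEMINI_API_KEY missing from env" % name)
    return problems
```

Run it against the contents of your MCP JSON file; an empty list means the basic shape is fine.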
### 5. Test the Setup

1. Restart LMStudio after adding the MCP configuration
2. Start a new chat in LMStudio
3. Ask any question: "What is quantum computing?"
4. Request peer review: "Use the ai_peer_review tool to check and improve your answer"
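Under the hood, the peer-review request above travels over MCP's standard `tools/call` JSON-RPC method. The message below sketches roughly what an MCP client such as LMStudio sends when the model invokes the tool; the argument names `question` and `answer` are assumptions about this server's schema, so check the tool definition in `server.py` for the real ones.

```python
# Illustrative MCP tools/call request from the client to the server.
# Argument names are assumptions; see the tool schema in server.py.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ai_peer_review",
        "arguments": {
            "question": "What is quantum computing?",
            "answer": "the local model's draft answer goes here",
        },
    },
}
```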
## Usage Examples

### Basic Usage

```
User: What causes climate change?
LLM: [Provides initial response about greenhouse gases...]
User: Use AI Peer Review to verify and improve that answer
LLM: [Calls ai_peer_review tool, receives feedback, provides enhanced response]
```

### Technical Questions

```
User: Explain how neural networks work
LLM: [Initial technical explanation...]
User: Can you use ai_peer_review to make sure the explanation is accurate?
LLM: [Enhanced response with better technical details and examples]
```

### Creative Tasks

```
User: Write a short story about AI
LLM: [Initial creative writing...]
User: Use peer review to improve the story structure and clarity
LLM: [Improved story with better narrative flow and character development]
```

## Configuration Options
### Environment Variables

- `GEMINI_API_KEY` - your Google Gemini API key (required)

### Customization
You can modify the peer review prompt in `server.py` to focus on specific aspects:

```python
review_prompt = f"""PEER REVIEW REQUEST:
# Customize this section for your specific needs
# Examples:
# - Focus on technical accuracy for coding questions
# - Emphasize creativity for writing tasks
# - Prioritize safety for medical/legal topics
...
"""
```

## Monitoring and Logs
The server writes detailed logs to `mcp-server.log`:

```bash
# Watch logs in real-time
tail -f mcp-server.log

# View recent activity
tail -50 mcp-server.log
```

Log information includes:
- Tool calls from LMStudio
- Requests sent to Gemini
- Raw Gemini responses
- Parsed feedback
- Error details
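The exact format of `mcp-server.log` depends on `server.py`, but the standard-library pattern below reproduces this kind of file logging; the logger name and format string are assumptions, only the log file name comes from this section.

```python
import logging

def setup_logging(path="mcp-server.log"):
    """File logger covering tool calls, Gemini requests/responses, and errors."""
    logger = logging.getLogger("ai-peer-review")
    logger.setLevel(logging.DEBUG)
    handler = logging.FileHandler(path)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(message)s")
    )
    logger.addHandler(handler)
    return logger
```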
## Troubleshooting

### Common Issues

**"Tool not available"**

- Verify the MCP server configuration in LMStudio
- Ensure your local model supports tool calling
- Restart LMStudio after configuration changes
**"GEMINI_API_KEY not found"**

- Check that your `.env` file exists and contains the correct key
- Verify the API key is valid in Google AI Studio
- Ensure the environment variable is set in your LMStudio config
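When chasing this error, a quick sanity check from a Python REPL saves time. The `AIza` prefix test matches the key format mentioned in the Quick Start; the helper itself is illustrative.

```python
import os

def check_api_key(env=os.environ):
    """Quick sanity check for the GEMINI_API_KEY environment variable."""
    key = env.get("GEMINI_API_KEY", "")
    if not key:
        return "GEMINI_API_KEY is not set - check your .env file or MCP env block"
    if not key.startswith("AIza"):
        return "key does not look like a Gemini key (expected AIza... prefix)"
    return "ok"
```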
**"Rate limit exceeded"**

- The Google Gemini free tier has generous limits
- Wait a moment and try again
- Check your quota usage in Google AI Studio
**"Model not found"**

- API model names change over time
- Update `GEMINI_API_URL` in `server.py` if needed
- Check Google's latest API documentation
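If the configured model name has gone stale, Google's model-list endpoint (`GET https://generativelanguage.googleapis.com/v1beta/models?key=YOUR_KEY`) shows what your key can currently use. The filter below assumes the `supportedGenerationMethods` field documented for that response; the sample data in the usage note is made up.

```python
def generate_content_models(models_json):
    """Names of models supporting generateContent, from a ListModels response."""
    return [
        m["name"]
        for m in models_json.get("models", [])
        if "generateContent" in m.get("supportedGenerationMethods", [])
    ]
```

Feed it the parsed JSON from the endpoint above and pick a returned name for `GEMINI_API_URL`.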
### Debug Mode

Run the server manually to see detailed output (make sure your virtual environment is active):

```bash
export GEMINI_API_KEY=your_api_key_here
python server.py
```

## Privacy and Security
- **Data sharing only on request** - content is sent to Gemini only when you explicitly trigger a review
- **No persistent storage** - conversations are not kept beyond the current session (diagnostics do go to the local `mcp-server.log`)
- **API key security** - keep your Gemini API key private and secure
- **Local processing** - the MCP server runs entirely on your machine
## Limitations

- **Requires tool-calling models** - basic instruction-following models won't work
- **Internet connection required** - needs access to the Google Gemini API
- **Rate limits** - subject to Google Gemini API quotas (the free tier is generous)
- **Language support** - optimized for English; other languages may work but aren't tested
## Roadmap

- **Multi-provider support** - add Groq, DeepSeek, and other AI APIs
- **Smart routing** - automatic provider selection based on question type
- **Confidence thresholds** - auto-trigger peer review for uncertain responses
- **Custom review templates** - domain-specific review criteria
- **Usage analytics** - track improvement metrics and API usage
- **Batch processing** - review multiple responses at once
## Contributing

We welcome contributions! Here's how to help:

### Development Setup
```bash
git clone https://github.com/your-repo/ai-peer-review-mcp  # Replace with your repo URL
cd ai-peer-review-mcp

# Create and activate virtual environment
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Set up your environment
cp .env.example .env
# --> Add your GEMINI_API_KEY to the .env file

echo "Development environment ready. Run with 'python server.py'"
```

### Ways to Contribute
- **Bug reports** - open issues for any problems you encounter
- **Feature requests** - suggest new capabilities or improvements
- **Documentation** - improve setup guides, add examples
- **Code contributions** - submit pull requests for fixes or features
- **Testing** - try the server with different models and report compatibility
- **Localization** - help support more languages
### Contribution Guidelines

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes with clear, descriptive commits
4. Add tests if applicable
5. Update documentation for any new features
6. Submit a pull request with a clear description
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Acknowledgments
- **Anthropic** - for creating the Model Context Protocol standard
- **Google** - for providing the Gemini API
- **LMStudio** - for excellent MCP integration
- **Community contributors** - everyone who helps improve this project

## Support

- **Issues:** GitHub Issues
- **Discussions:** GitHub Discussions
## Star History

If this project helps you, please consider giving it a star on GitHub! ⭐

Made with ❤️ for the AI community