AI Peer Review MCP Server
Enhance your local LLM responses with real-time peer review from Google Gemini
A Model Context Protocol (MCP) server that enables local language models to request peer review feedback from Google Gemini, dramatically improving response quality through AI collaboration.
Features
Real-time peer review from Google Gemini for any local LLM response
Manual trigger system - user controls when to request peer review
Detailed feedback analysis - accuracy, completeness, clarity, and improvement suggestions
Comprehensive logging - see exactly what feedback Gemini provides
Privacy-conscious - only shares content when explicitly requested
Free to use - leverages Google Gemini's free tier
Easy integration - works with any MCP-compatible local LLM setup
Use Cases
Fact-checking complex or technical responses
Quality improvement for educational content
Writing enhancement for creative tasks
Technical validation for coding explanations
Research assistance with multiple AI perspectives
Prerequisites
Python 3.8+ installed on your system
LMStudio (or another MCP-compatible LLM client)
Google AI Studio account (free) for Gemini API access
Local LLM with tool calling support (e.g., Llama 3.1, Mistral, Qwen)
Quick Start
1. Get Google Gemini API Key
Visit Google AI Studio
Sign in with your Google account
Click "Get API key" → "Create API key in new project"
Copy your API key (starts with AIza...)
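The troubleshooting section below refers to a .env file, so one common way to make the key available to the server (an assumption about this project's setup; adapt the path as needed) is:

```shell
# Save the key to a .env file in the server's directory
# (replace the placeholder with your actual key; never commit this file).
cat > .env <<'EOF'
GEMINI_API_KEY=your-api-key-here
EOF
```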
2. Install the MCP Server
3. Review Server Files
requirements.txt:
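The authoritative list is in the repository; as a rough sketch, a requirements.txt for a server like this typically pins an HTTP client and a .env loader (the package names below are plausible guesses, not the project's confirmed dependencies):

```
# Plausible contents only - check the repository's actual requirements.txt
requests
python-dotenv
```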
server.py: (See full code in the repository)
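For orientation while you review the real server.py, a minimal sketch of its core pieces might look like the following. The endpoint, model name, prompt wording, and function names here are illustrative assumptions, not the repository's exact code; the stdlib `urllib` client stands in for whatever HTTP library the project actually uses.

```python
import json
import os
import urllib.request

# Illustrative endpoint and model name; check Google's current API docs
# and the value of GEMINI_API_URL in the repository's server.py.
GEMINI_API_URL = (
    "https://generativelanguage.googleapis.com/v1beta/"
    "models/gemini-1.5-flash:generateContent"
)

def build_review_prompt(question: str, answer: str) -> str:
    """Compose the peer-review request sent to Gemini (illustrative wording)."""
    return (
        "You are a peer reviewer. Evaluate the answer below for accuracy, "
        "completeness, and clarity, then suggest concrete improvements.\n\n"
        f"Question: {question}\n\n"
        f"Answer to review: {answer}"
    )

def request_peer_review(question: str, answer: str) -> str:
    """POST the prompt to Gemini and return the feedback text."""
    api_key = os.environ["GEMINI_API_KEY"]
    payload = {
        "contents": [{"parts": [{"text": build_review_prompt(question, answer)}]}]
    }
    req = urllib.request.Request(
        f"{GEMINI_API_URL}?key={api_key}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    # Response shape of the generateContent endpoint at the time of writing.
    return data["candidates"][0]["content"]["parts"][0]["text"]
```

The real server additionally wraps `request_peer_review` in an MCP tool definition so LMStudio can call it; see the repository for that wiring.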
4. Configure LMStudio or any other supported MCP Host (e.g., Claude Desktop)
Add this configuration to your LMStudio MCP settings:
Finding MCP Settings in LMStudio:
Look for: Settings → MCP Servers
Or: Tools & Integrations → MCP Configuration
Or: Program button → Edit MCP JSON
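The exact JSON lives in the repository; a typical MCP server entry (the server name, path, and key below are placeholders you must adapt) looks roughly like:

```json
{
  "mcpServers": {
    "ai-peer-review": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"],
      "env": {
        "GEMINI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```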
5. Test the Setup
Restart LMStudio after adding the MCP configuration
Start a new chat in LMStudio
Ask any question: "What is quantum computing?"
Request peer review: "Use the ai_peer_review tool to check and improve your answer"
Usage Examples
Basic Usage
Technical Questions
Creative Tasks
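As an illustration (the wording below is a made-up example, not a transcript from the repository), a peer-review exchange in the LMStudio chat might look like:

```
You: What causes the seasons on Earth?
LLM: [initial answer]
You: Use the ai_peer_review tool to check and improve your answer.
LLM: [revised answer incorporating Gemini's feedback on accuracy and completeness]
```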
Configuration Options
Environment Variables
GEMINI_API_KEY - Your Google Gemini API key (required)
Customization
You can modify the peer review prompt in server.py to focus on specific aspects:
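For example, narrowing the review to code correctness could look like the following (the template name and function are hypothetical; the actual identifier in server.py may differ):

```python
# Hypothetical template name; adapt the actual prompt string in server.py.
CODE_REVIEW_PROMPT = (
    "You are reviewing a coding explanation. Focus only on technical "
    "accuracy and whether any code examples would run as written.\n\n"
    "Question: {question}\n\n"
    "Answer to review: {answer}"
)

def render_review_prompt(question: str, answer: str) -> str:
    """Fill the template with the exchange being reviewed."""
    return CODE_REVIEW_PROMPT.format(question=question, answer=answer)
```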
Monitoring and Logs
The server creates detailed logs in mcp-server.log:
Log Information Includes:
Tool calls from LMStudio
Requests sent to Gemini
Raw Gemini responses
Parsed feedback
Error details
Troubleshooting
Common Issues
"Tool not available"
Verify MCP server configuration in LMStudio
Ensure your local model supports tool calling
Restart LMStudio after configuration changes
"GEMINI_API_KEY not found"
Check that your .env file exists and contains the correct key
Verify the API key is valid in Google AI Studio
Ensure the environment variable is properly set in the LMStudio config
"Rate limit exceeded"
The Google Gemini free tier has generous limits, but they can still be exceeded
Wait a moment and try again
Check Google AI Studio quota usage
"Model not found"
API model names change over time
Update GEMINI_API_URL in server.py if needed
Check Google's latest API documentation
Debug Mode
Run the server manually to see detailed output. Make sure your virtual environment is active.
Privacy and Security
Data sharing only on request - content is only sent to Gemini when explicitly triggered
No persistent storage - conversations are not stored or logged beyond current session
API key security - keep your Gemini API key private and secure
Local processing - MCP runs entirely on your machine
Limitations
Requires tool-calling models - basic instruction-following models won't work
Internet connection required - needs access to Google Gemini API
Rate limits - subject to Google Gemini API quotas (free tier is generous)
Language support - optimized for English, other languages may work but aren't tested
Roadmap
Multi-provider support - Add Groq, DeepSeek, and other AI APIs
Smart routing - Automatic provider selection based on question type
Confidence thresholds - Auto-trigger peer review for uncertain responses
Custom review templates - Domain-specific review criteria
Usage analytics - Track improvement metrics and API usage
Batch processing - Review multiple responses at once
Contributing
We welcome contributions! Here's how to help:
Development Setup
Ways to Contribute
Bug reports - Open issues for any problems you encounter
Feature requests - Suggest new capabilities or improvements
Documentation - Improve setup guides, add examples
Code contributions - Submit pull requests for fixes or features
Testing - Try with different models and report compatibility
Localization - Help support more languages
Contribution Guidelines
Fork the repository
Create a feature branch (git checkout -b feature/amazing-feature)
Make your changes with clear, descriptive commits
Add tests if applicable
Update documentation for any new features
Submit a pull request with a clear description
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
Anthropic - For creating the Model Context Protocol standard
Google - For providing the Gemini API
LMStudio - For excellent MCP integration
Community contributors - Everyone who helps improve this project
Support
Issues: GitHub Issues
Discussions: GitHub Discussions
Star History
If this project helps you, please consider giving it a star on GitHub! ⭐
Made with ❤️ for the AI community