Gemini CLI Orchestrator MCP

A lightweight CLI tool and MCP server enabling AI agents to perform deep codebase analysis with Gemini's massive context window.

- Integrates with IDEs such as Windsurf (from Codeium) to provide codebase analysis capabilities through the MCP protocol
- Leverages Google Gemini's large context window to perform comprehensive code analysis, security audits, and codebase exploration
- Runs as a Node.js-based MCP server so AI agents can analyze codebases across various programming languages
🚀 Getting Started
Step 1: Install Gemini CLI
Step 2: Install this tool
Step 3: Test it works
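The three steps can be sketched as shell commands. The npm package name for the Gemini CLI and this repository's URL are assumptions; substitute your own, and adjust the final invocation to the tool's actual argument order:

```shell
# Step 1: install the Google Gemini CLI (assumed npm package name)
npm install -g @google/gemini-cli

# Step 2: install this tool (hypothetical repository URL)
git clone https://github.com/example/gemini-cli-orchestrator.git
cd gemini-cli-orchestrator
npm install

# Step 3: smoke-test with a tiny analysis
node gemini-direct.mjs "summarize this project" "@package.json"
```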
That's it! Authentication happens automatically on first use.
Two Ways to Use
🚀 MCP Server (Recommended for Agents)
Makes this tool available to any AI agent via Model Context Protocol
MCP Configuration by IDE
Claude Code CLI
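For the Claude Code CLI, registration is typically a single `claude mcp add` command. The server name and file path below are placeholders:

```shell
claude mcp add gemini-orchestrator -- node /absolute/path/to/mcp-server.mjs
```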
Claude Desktop
Config file locations:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
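A minimal `claude_desktop_config.json` entry might look like the sketch below. The server name and path are placeholders; Cursor's `mcp.json` and Windsurf's `mcp_config.json` accept the same `mcpServers` shape:

```json
{
  "mcpServers": {
    "gemini-orchestrator": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-server.mjs"]
    }
  }
}
```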
Cursor IDE
Config file: .cursor/mcp.json (project) or ~/.cursor/mcp.json (global)
Windsurf IDE
Config file: ~/.codeium/windsurf/mcp_config.json
📁 Quick Setup: Copy example configs from the .ide-configs/ directory
Any agent can now use:
analyze_with_gemini("find security issues", "@src/auth/ @middleware/")
- Intelligent file selection guided by tool description
- Cross-file analysis with Gemini's massive context window
💻 Direct CLI (For Scripts/Power Users)
Ultra-simple direct usage
Quick Start
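Assuming the CLI takes a prompt followed by @-patterns (an assumption based on the `analyze_with_gemini` example above), direct usage looks like:

```shell
# Ask one question across several files in a single Gemini call
node gemini-direct.mjs "explain the startup sequence" "@src/ @package.json"
```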
Features
✅ @ syntax file inclusion - @src/, @**/*.js, @package.json
✅ Semantic keywords - @authentication, @database, @config (via .gemini-direct.json)
✅ 5 core templates - security, architecture, performance, quality, debug
✅ Direct Gemini calls - no MCP overhead
✅ Zero configuration - works immediately
✅ Single dependency - just glob
Examples
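Hedged examples of the five templates in use; the template-selection syntax shown is an assumption, so adjust to the tool's actual flags:

```shell
node gemini-direct.mjs security "@src/auth/ @middleware/"   # OWASP-style audit
node gemini-direct.mjs architecture "@src/"                 # design and patterns review
node gemini-direct.mjs performance "@src/ @package.json"    # bottleneck identification
node gemini-direct.mjs quality "@src/"                      # best-practices review
node gemini-direct.mjs debug "@src/server.mjs"              # bug triage
```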
How It Works
The tool has two components:
- CLI Tool (gemini-direct.mjs): Aggregates files using @ syntax and sends them to the Gemini CLI
- MCP Server (mcp-server.mjs): Makes the CLI tool available to AI agents via the standard Model Context Protocol
File patterns like @src/ expand to include multiple files in a single Gemini analysis request.
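The matching logic can be sketched as follows. This is a simplified, hypothetical matcher using only built-ins; the real tool delegates pattern expansion to the `glob` package:

```javascript
// Hypothetical sketch of how @-patterns could be matched against file paths.
function globToRegExp(glob) {
  let s = glob.replace(/[.+^${}()|[\]\\]/g, '\\$&'); // escape regex chars, keep '*'
  s = s.replace(/\*\*\//g, '\u0000');                // protect "**/" segments
  s = s.replace(/\*/g, '[^/]*');                     // '*' matches within one path segment
  s = s.replace(/\u0000/g, '(?:.*/)?');              // '**/' matches any directory depth
  return new RegExp('^' + s + '$');
}

function matchesPattern(pattern, filePath) {
  const p = pattern.replace(/^@/, '');                // strip the leading '@'
  if (p.endsWith('/')) return filePath.startsWith(p); // '@src/' includes the whole directory
  return globToRegExp(p).test(filePath);              // '@**/*.js', '@package.json'
}
```

For example, `matchesPattern('@**/*.js', 'lib/util.js')` is true, while `matchesPattern('@src/', 'lib/util.js')` is false.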
Requirements
- Node.js 18+
- Google Gemini CLI installed and authenticated (see setup below)
⚡ Quick Setup Check
Setup
1. Install Gemini CLI
2. Authenticate with Google (OAuth - FREE)
The Gemini CLI uses OAuth authentication. No explicit auth command needed - authentication happens automatically on first use.
First Run: If not authenticated, Gemini CLI will automatically open your browser for OAuth login.
What Gets Created: a ~/.gemini/ directory holding your stored OAuth tokens (including oauth_creds.json).
How It Works:
- First time: any gemini command opens a browser for OAuth
- Subsequent calls: the Gemini CLI automatically uses stored tokens
- Token refresh: happens automatically when needed
- Your tool: inherits authentication from the Gemini CLI
Cross-Platform Paths:
| OS | Auth Directory |
|---|---|
| Linux/macOS | ~/.gemini/ |
| Windows | %USERPROFILE%\.gemini\ |
| Docker | Mount host ~/.gemini/ as a volume |
Uses Google OAuth authentication (personal Google account).
3. Verify Authentication
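A quick way to confirm authentication is to send a trivial non-interactive prompt (the Gemini CLI's `-p`/`--prompt` flag):

```shell
gemini -p "Reply with OK"
```

If the browser OAuth flow opens instead, complete the login once and re-run the command.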
4. Install and Test This Tool
Authentication Details
No Code Changes Needed - the tool inherits authentication automatically because the Gemini CLI itself handles reading ~/.gemini/oauth_creds.json; existing credentials are picked up with no extra configuration.
Troubleshooting
"Command not found: gemini"
"Authentication failed"
"GEMINI_CLI_PATH not found"
The tool automatically finds the Gemini CLI. If you have issues:
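As a hypothetical workaround, locate the binary yourself and export the path the tool looks for (the fallback location below is only an example):

```shell
# Point GEMINI_CLI_PATH at the gemini binary, with an example fallback location
export GEMINI_CLI_PATH="$(command -v gemini || echo "$HOME/.npm-global/bin/gemini")"
echo "Using Gemini CLI at: $GEMINI_CLI_PATH"
```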
Templates
- security - OWASP-style security audit
- architecture - System design and patterns analysis
- performance - Bottleneck identification and optimization
- quality - Code quality and best practices review
- debug - Bug identification and troubleshooting
Semantic Keywords
Create a .gemini-direct.json file in your project root to define semantic keywords that map to file patterns:
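A hypothetical `.gemini-direct.json` might look like this; the key names and patterns are examples only, mapping each keyword to one or more glob patterns:

```json
{
  "authentication": ["src/auth/**", "middleware/auth*.js"],
  "database": ["src/db/**", "migrations/**"],
  "config": ["*.config.js", ".env.example", "package.json"]
}
```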
Usage:
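A keyword defined in .gemini-direct.json can then stand in for raw file patterns (invocation syntax assumed):

```shell
node gemini-direct.mjs "review the login flow" "@authentication"
```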
Distribution
This tool is designed to be:
- Copied - 3 files, copy anywhere
- Shared - Send to colleagues, zero setup
- Embedded - Drop into any project
- Global - npm install -g for system-wide use
Perfect for getting real value from Gemini's massive context window without the complexity overhead.