gemini-researcher
Gemini Researcher is a lightweight, stateless MCP server that offloads deep codebase analysis to Gemini CLI, reducing agent context usage while providing structured JSON insights into local projects.
Key capabilities:

- Quick code queries (`quick_query`) - Fast analysis of specific files or code sections using Gemini's flash model, with adjustable verbosity (concise/normal/detailed)
- Deep research (`deep_research`) - Comprehensive multi-file analysis for architectural reviews, security audits, and cross-file investigations using Gemini's pro model
- Directory structure mapping (`analyze_directory`) - Enumerate and understand project organization, file purposes, and module relationships with configurable depth and file limits
- Path validation (`validate_paths`) - Pre-verify file accessibility within the project root before running expensive queries
- Health diagnostics (`health_check`) - Check server status, Gemini CLI installation, authentication, and configuration to troubleshoot issues
- Large response handling (`fetch_chunk`) - Automatically chunks responses over ~10KB, with 1-hour caching for continuation retrieval
- Context-aware analysis - All queries support focus areas (security, architecture, performance, general) to guide targeted insights
- Token efficiency - Analyzes files directly from disk using `@path` references instead of consuming the agent's context window
- Read-only safety - Stateless server that only reads files, never modifies them
- Cross-client compatibility - Works with Claude Code, Cursor, VS Code (GitHub Copilot), and other MCP clients
- Docker deployment - Run as a containerized service with project mounting and environment configuration
Delegates deep repository analysis and codebase research to Google's Gemini models via the Gemini CLI, enabling architectural reviews and multi-file analysis using large context windows.
Gemini Researcher
A lightweight, stateless MCP (Model Context Protocol) server that lets developer agents (Claude Code, GitHub Copilot) hand off deep repository analysis to the Gemini CLI. The server is read-only, returns structured JSON (as text content), and is designed to reduce the calling agent's context and model usage.
Status: v1 complete. Core features are stable, but still early days. Feedback welcome!
If this saved you tokens, ⭐ please consider giving it a star! :)
The primary goals:
Reduce agent context usage by letting Gemini CLI read large codebases locally and do its own research
Reduce calling-agent model usage by offloading heavy analysis to Gemini
Keep the server stateless and read-only for safety
Why use this?
Instead of copying entire files into your agent's context (burning tokens and cluttering the conversation), this server lets Gemini CLI read files directly from your project. Your agent sends a research query, Gemini reads and synthesizes using its large context window, and returns structured results. You save tokens, your agent stays focused, and complex codebase analysis becomes practical.
Verified clients: Claude Code, Cursor, VS Code (GitHub Copilot)
It should work with other clients as well, but I haven't personally tested them yet. Please open an issue if you try it elsewhere!
Table of contents
Overview
Gemini Researcher accepts queries from your AI agent and uses Gemini CLI to analyze your local code files. Results are returned as formatted JSON for your agent to use.
Runtime safety
The server runs Gemini CLI with safety restrictions enabled. See docs/runtime-contract.md for full technical details.
Default invocation pattern:

```shell
gemini [ -m <model> ] --output-format json --approval-mode default [--admin-policy <path>] -p "<prompt>"
```

Key safety points:

- Uses `--approval-mode default` (not yolo mode) for controlled execution
- Enforces a read-only policy by default to prevent file changes
- The policy blocks mutating tools such as `write_file`, `replace`, and `run_shell_command`
- Strict enforcement can be disabled with `GEMINI_RESEARCHER_ENFORCE_ADMIN_POLICY=0` (not recommended)
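As a rough illustration of the invocation pattern above, the argument list can be assembled like this (a hypothetical helper, not the server's actual code; the model name in the example is only a placeholder):

```python
def build_gemini_argv(prompt, model=None, admin_policy=None):
    """Assemble the documented default invocation pattern.

    Illustrative only -- the real server's internals may differ.
    """
    argv = ["gemini"]
    if model:
        argv += ["-m", model]
    argv += ["--output-format", "json", "--approval-mode", "default"]
    if admin_policy:
        argv += ["--admin-policy", admin_policy]
    argv += ["-p", prompt]
    return argv

# Example: a flash-model query with the read-only policy attached
print(build_gemini_argv("Explain @src/auth.ts", model="gemini-2.5-flash",
                        admin_policy="policies/read-only-enforcement.toml"))
```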
Auth and health check
Run `health_check` with `includeDiagnostics: true` to see auth status and server health.
| authStatus | What it means | Impact |
| --- | --- | --- |
| `authenticated` | Gemini CLI is authenticated | Server ready to use |
| `missing` | No valid authentication found | Server marked as degraded |
| `unknown` | Could not verify auth status | Server marked as degraded |
`health_check.status` values:

- `ok`: Gemini CLI is available, auth is working, and the safety policy is enforced
- `degraded`: Setup incomplete, auth unclear, or safety policy disabled
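The status rules can be summarized as a tiny decision function (an illustration of the documented contract, not the server's implementation):

```python
def health_status(cli_available: bool, auth_ok: bool, policy_enforced: bool) -> str:
    """Map health_check diagnostics to the documented status values.

    The server reports "ok" only when the CLI is available, auth works,
    and the read-only safety policy is enforced; anything else is "degraded".
    """
    return "ok" if (cli_available and auth_ok and policy_enforced) else "degraded"
```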
Prerequisites
- Node.js 18+ installed
- Gemini CLI installed: `npm install -g @google/gemini-cli`
- Gemini CLI authenticated (recommended: run `gemini` → Login with Google) or set `GEMINI_API_KEY`

Quick checks:

```shell
node --version
gemini --version
```

Quickstart
Step 1: Validate environment
Run the setup wizard to verify Gemini CLI is installed and authenticated:
```shell
npx gemini-researcher init
```

Step 2: Configure your MCP client
The standard configuration works with most tools:
```json
{
  "mcpServers": {
    "gemini-researcher": {
      "command": "npx",
      "args": [
        "gemini-researcher"
      ]
    }
  }
}
```

On native Windows, some MCP hosts use shell-less process spawning and may not resolve npm command shims (`npx`, `gemini`) reliably.
If startup fails with launch errors (`spawn ... ENOENT` / `GEMINI_CLI_LAUNCH_FAILED` despite the commands working in PowerShell), prefer Docker or WSL for immediate reliability.
See the full remediation tree in docs/platforms/windows.md.
Add to your VS Code MCP settings (create `.vscode/mcp.json` if needed):
```json
{
  "servers": {
    "gemini-researcher": {
      "command": "npx",
      "args": [
        "gemini-researcher"
      ]
    }
  }
}
```

Option 1: Command line (recommended)
Local (user-wide) scope
```shell
# Add the MCP server via CLI
claude mcp add --transport stdio gemini-researcher -- npx gemini-researcher

# Verify it was added
claude mcp list
```

Project scope
Navigate to your project directory, then run:
```shell
# Add the MCP server via CLI
claude mcp add --scope project --transport stdio gemini-researcher -- npx gemini-researcher

# Verify it was added
claude mcp list
```

Option 2: Manual configuration
Add to `.mcp.json` in your project root (project scope):

```json
{
  "mcpServers": {
    "gemini-researcher": {
      "command": "npx",
      "args": [
        "gemini-researcher"
      ]
    }
  }
}
```

Or add to `~/.claude/settings.json` for local scope.
After adding the server, restart Claude Code and use /mcp to verify the connection.
Go to Cursor Settings -> Tools & MCP -> Add a Custom MCP Server. Add the following configuration:
```json
{
  "mcpServers": {
    "gemini-researcher": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "gemini-researcher"
      ]
    }
  }
}
```

The server uses the directory where your MCP client launched it (typically the workspace the IDE opened, or your terminal's working directory) as the project root. To analyze a different directory, set `PROJECT_ROOT`:
Example
```json
{
  "mcpServers": {
    "gemini-researcher": {
      "command": "npx",
      "args": [
        "gemini-researcher"
      ],
      "env": {
        "PROJECT_ROOT": "/path/to/your/project"
      }
    }
  }
}
```

Step 3: Restart your MCP client
Step 4: Test it
Ask your agent: "Use gemini-researcher to analyze the project."
Tools
All tools return structured JSON (as MCP text content). Large responses are chunked (~10KB per chunk) and cached for 1 hour.
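The chunking contract can be pictured with a small sketch (the chunk size, cache-key format, and cache shape here are illustrative assumptions, not the server's internals):

```python
import time
import uuid

CHUNK_SIZE = 10_000   # ~10KB per chunk (approximate, per the docs)
CACHE_TTL = 3600      # cached for 1 hour
_cache: dict = {}

def store_chunks(text: str) -> dict:
    """Split a large response and cache the remainder for later retrieval."""
    chunks = [text[i:i + CHUNK_SIZE] for i in range(0, len(text), CHUNK_SIZE)]
    key = f"cache_{uuid.uuid4().hex[:8]}"          # hypothetical key format
    _cache[key] = (time.time() + CACHE_TTL, chunks)
    return {"cacheKey": key, "totalChunks": len(chunks), "chunk": chunks[0]}

def fetch_chunk(cache_key: str, chunk_index: int) -> str:
    """Retrieve one cached chunk, refusing expired entries."""
    expires, chunks = _cache[cache_key]
    if time.time() > expires:
        raise KeyError("cache entry expired")
    return chunks[chunk_index]
```

An agent would call the `fetch_chunk` tool with the returned `cacheKey` and an increasing `chunkIndex` until all chunks are retrieved.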
| Tool | Purpose | When to use |
| --- | --- | --- |
| `quick_query` | Fast analysis with flash model | Quick questions about specific files or small code sections |
| `deep_research` | In-depth analysis with pro model | Complex multi-file analysis, architecture reviews, security audits |
| `analyze_directory` | Map directory structure | Understanding unfamiliar codebases, generating project overviews |
| `validate_paths` | Pre-check file paths | Verify files exist before running expensive queries |
| `health_check` | Diagnostics | Troubleshooting server/Gemini CLI issues |
| `fetch_chunk` | Get chunked responses | Retrieve remaining parts of large responses |
Query tool fallback chains are family-aware:
- `quick_query`: flash -> flash_lite -> auto
- `deep_research`: pro -> flash -> flash_lite -> auto
- `analyze_directory`: flash -> flash_lite -> auto
When using API-key auth, fallback also handles model-unavailable/unsupported errors (not only quota/capacity errors).
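The fallback behavior can be sketched like this (model and tool names come from the chains above; the retry mechanics and error type are assumptions for illustration):

```python
FALLBACK_CHAINS = {
    "quick_query": ["flash", "flash_lite", "auto"],
    "deep_research": ["pro", "flash", "flash_lite", "auto"],
    "analyze_directory": ["flash", "flash_lite", "auto"],
}

def run_with_fallback(tool, attempt):
    """Try each model in the tool's family-aware chain until one succeeds.

    `attempt` is a callable taking a model name; it raises on
    quota/capacity or model-unavailable errors (illustrated here
    with RuntimeError).
    """
    last_err = None
    for model in FALLBACK_CHAINS[tool]:
        try:
            return attempt(model)
        except RuntimeError as err:
            last_err = err      # remember and try the next model
    raise last_err              # every model in the chain failed
```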
Example workflows
Understanding a security vulnerability:
Agent: Use deep_research to analyze authentication flow across @src/auth and @src/middleware, focusing on security

Quick code explanation:

Agent: Use quick_query to explain the login flow in @src/auth.ts, be concise

Mapping an unfamiliar codebase:

Agent: Use analyze_directory on src/ with depth 3 to understand the project structure

quick_query
```json
{
  "prompt": "Explain @src/auth.ts login flow",
  "focus": "security",
  "responseStyle": "concise"
}
```

deep_research

```json
{
  "prompt": "Analyze authentication across @src/auth and @src/middleware",
  "focus": "architecture",
  "citationMode": "paths_only"
}
```

analyze_directory

```json
{
  "path": "src",
  "depth": 3,
  "maxFiles": 200
}
```

validate_paths

```json
{
  "paths": ["src/auth.ts", "README.md"]
}
```

health_check

```json
{
  "includeDiagnostics": true
}
```

fetch_chunk

```json
{
  "cacheKey": "cache_abc123",
  "chunkIndex": 2
}
```

Docker
A pre-built multi-platform Docker image is available on Docker Hub:
```shell
# Pull the image (works on Intel/AMD and Apple Silicon)
docker pull capybearista/gemini-researcher:latest

# Run the server (mount your project and provide API key)
docker run -i --rm \
  -e GEMINI_API_KEY="your-api-key" \
  -v /path/to/your/project:/workspace \
  capybearista/gemini-researcher:latest
```

For MCP client configuration with Docker:

```json
{
  "mcpServers": {
    "gemini-researcher": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "GEMINI_API_KEY",
        "-v", "/path/to/your/project:/workspace",
        "capybearista/gemini-researcher:latest"
      ],
      "env": {
        "GEMINI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

Notes:

- The `-i` flag is required for stdio transport
- The container mounts your project to `/workspace` (the project root)
- Replace `/path/to/your/project` with your actual project path
- Replace `your-api-key-here` with your actual Gemini API key (required for Docker usage)
Platform guides
Native Windows launch model and remediation:
docs/platforms/windows.md
Troubleshooting (common issues)
Remediation decision tree:
| Error / signal | Run this check first | Change this configuration next |
| --- | --- | --- |
| `spawn ... ENOENT` / `GEMINI_CLI_LAUNCH_FAILED` on native Windows | Run `gemini --help` and `npx --version` interactively | Prefer Docker or WSL config. If staying native, point host command to a stable shim/binary path and restart host. |
| `GEMINI_CLI_NOT_FOUND` | Run `gemini --version` | Update host config to launch the reported `gemini` path, or install the CLI. |
| MCP host cannot launch server via `npx` | Run `npx --version` interactively | Change host server command from `npx` to a stable binary path. |
| `ADMIN_POLICY_UNSUPPORTED` | Check whether `gemini --help` lists `--admin-policy` | Upgrade Gemini CLI to v0.36.0+ |
| `AUTH_MISSING` | Run `gemini` interactively | Authenticate Gemini CLI or set `GEMINI_API_KEY` |
- `GEMINI_CLI_NOT_FOUND`: Install Gemini CLI: `npm install -g @google/gemini-cli`
- `GEMINI_CLI_LAUNCH_FAILED`: This is a launch-path issue, not an auth/capability issue. On Windows, command shims can fail in shell-less spawn contexts. Validate `gemini --help` and `npx --version` interactively, then prefer Docker or WSL if the host launch mode is strict.
- `GEMINI_RESEARCHER_GEMINI_COMMAND`: Override the Gemini command name/path used by the server (for wrappers or pinned binary locations).
- `GEMINI_RESEARCHER_GEMINI_ARGS_PREFIX`: Prefix extra Gemini args for every invocation (for example `--config <file>`). `health_check` diagnostics redact sensitive token-like values in the configured args-prefix output.
- `AUTH_MISSING`: Run `gemini` and authenticate, or set `GEMINI_API_KEY`.
- `AUTH_UNKNOWN`: Auth could not be confirmed (often a network or CLI probe failure). If launch errors are present, fix the launch path first; otherwise verify `gemini` works interactively, then retry.
- `ADMIN_POLICY_MISSING`: Reinstall the package or verify `policies/read-only-enforcement.toml` exists in the installed package.
- `ADMIN_POLICY_UNSUPPORTED`: Upgrade Gemini CLI to v0.36.0+ (`gemini --help` should include `--admin-policy`).
- Capability errors (`ADMIN_POLICY_UNSUPPORTED`, output format unsupported) should be interpreted only after a successful `gemini --help` probe. If the probe launch fails, treat it as a launch-path failure first.
- `GEMINI_RESEARCHER_ENFORCE_ADMIN_POLICY=0`: Disables strict startup policy checks. This reduces safety guarantees.
- `.gitignore` blocking files: Gemini respects `.gitignore` by default; toggle `fileFiltering.respectGitIgnore` in `gemini /settings` if you intentionally want ignored files included (note: this changes Gemini behavior globally).
- `PATH_NOT_ALLOWED`: All `@path` references must resolve inside the configured project root (`process.cwd()` by default). Use `validate_paths` to pre-check paths.
- `QUOTA_EXCEEDED`: The server retries with fallback models; if all options are exhausted, reduce scope (use `quick_query`) or wait for the quota to reset.
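The `PATH_NOT_ALLOWED` containment rule amounts to a check like this (an illustrative sketch, not the server's code):

```python
from pathlib import Path

def is_allowed(project_root: str, candidate: str) -> bool:
    """True only if candidate resolves inside project_root (no ../ escapes)."""
    root = Path(project_root).resolve()
    target = (root / candidate).resolve()
    return target == root or root in target.parents

print(is_allowed("/tmp/proj", "src/auth.ts"))    # inside the root: True
print(is_allowed("/tmp/proj", "../etc/passwd"))  # escapes the root: False
```

Resolving before comparing is what defeats `../` traversal; a plain string-prefix check would not.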
Contributing
Read the Contributing Guide to get started.
Quick links:
License
Maintenance
Latest Blog Posts