This server provides search and retrieval tools for Google Gemini API documentation through an MCP (Model Context Protocol) interface.
Available Tools:
- search_documentation: Performs keyword-based, full-text searches across all Gemini documentation pages using short queries (1-3 keywords each, max 3 queries at once).
- get_capability_page: Retrieves the complete content of a specific documentation page by its exact title, or call it without arguments to get a master list of all available page titles.
- get_current_model: Quickly accesses the dedicated "Gemini Models" documentation page with details about model variants (Pro, Flash, etc.), their capabilities, versioning, and context window sizes.
Key Features:
- Automatic documentation updates on server startup by scraping from ai.google.dev
- Local SQLite database with FTS5 full-text search indexing for efficient querying and offline access
- Supports Python and TypeScript SDK documentation
Provides tools for searching and retrieving Google Gemini API documentation, including full-text search across documentation pages, listing available capabilities, and accessing current model documentation.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Gemini Docs MCP Server search for how to use embeddings with Gemini".
That's it! The server will respond to your query, and you can continue using it as needed.
Gemini Docs MCP Server
A remote HTTP MCP server that provides tools to search and retrieve Google Gemini API documentation. The server exposes the MCP protocol at the /mcp endpoint and can be deployed to Cloud Run or other containerized platforms. It also supports local stdio mode for development.
- Search Documentation: Full-text search across all Gemini documentation pages.
- Get Capabilities: List available documentation pages or retrieve content for a specific page.
- Get Current Model: Quickly access documentation for current Gemini models.
- Automatic Updates: Scrapes and updates documentation on server startup.
```mermaid
sequenceDiagram
    participant Client as MCP Client / IDE
    participant Server as FastMCP Server
    participant DB as SQLite Database
    Client->>Server: call_tool("search_documentation", queries=["embeddings"])
    Server->>DB: Full-Text Search for "embeddings"
    DB-->>Server: Return matching documentation
    Server-->>Client: Return formatted results
```

How it Works
1. Ingestion: On startup, the server fetches https://ai.google.dev/gemini-api/docs/llms.txt to get a list of all available documentation pages.
2. Processing: It then concurrently fetches and processes each page, extracting the text content.
3. Indexing: The processed content is stored in a local SQLite database with a Full-Text Search (FTS5) index for efficient querying.
4. Searching: When you use the search_documentation tool, the server queries this SQLite database to find the most relevant documentation pages.
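The indexing and search steps above can be sketched with Python's built-in sqlite3 module. The table and column names here are illustrative, not the server's actual schema:

```python
import sqlite3

# In-memory DB for illustration; the real server persists to a file on disk.
conn = sqlite3.connect(":memory:")

# An FTS5 virtual table indexes page title and body for full-text search.
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(title, content)")
conn.executemany(
    "INSERT INTO docs (title, content) VALUES (?, ?)",
    [
        ("Embeddings", "Generate embeddings with the Gemini API ..."),
        ("Gemini Models", "Model variants, context windows, versioning ..."),
    ],
)

# MATCH runs the full-text query; bm25() orders results by relevance.
rows = conn.execute(
    "SELECT title FROM docs WHERE docs MATCH ? ORDER BY bm25(docs)",
    ("embeddings",),
).fetchall()
print(rows)  # [('Embeddings',)]
```

FTS5 tokenization is case-insensitive by default, so the query "embeddings" matches the "Embeddings" page but not the models page.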
Installation
Option 1: Use uvx (Recommended)
You can use uvx to run the server directly without explicit installation. This is the easiest way to get started.
```bash
uvx --from git+https://github.com/philschmid/gemini-api-docs-mcp gemini-docs-mcp
```

Option 2: Install directly from GitHub
You can install the package directly from GitHub using pip:
```bash
pip install git+https://github.com/philschmid/gemini-api-docs-mcp.git
```

Option 3: Manual Installation (for development)
```bash
git clone https://github.com/philschmid/gemini-api-docs-mcp.git
cd gemini-api-docs-mcp
pip install -e .
```

Note that with an editable install (`-e`), the cloned directory must remain in place; do not delete it afterwards.

Usage
Running as a Remote HTTP Server
The server runs as an HTTP server and exposes the MCP protocol at the /mcp endpoint. It respects the PORT environment variable (defaults to 8080).
```bash
# Set port (optional, defaults to 8080)
export PORT=8080

# Run the server
gemini-docs-mcp
```

The server will be accessible at http://localhost:8080/mcp (or your configured port).
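The PORT-based behavior could be implemented along these lines; this is a sketch of the assumed selection logic (HTTP when PORT is set, stdio otherwise), not the server's actual code:

```python
import os

def select_transport(env=None):
    """Pick HTTP when PORT is set (containerized), stdio otherwise (assumed logic)."""
    env = os.environ if env is None else env
    port = env.get("PORT")
    if port is not None:
        return ("http", int(port))
    return ("stdio", None)

print(select_transport({"PORT": "8080"}))  # ('http', 8080)
print(select_transport({}))                # ('stdio', None)
```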
Docker Deployment
Build and run the Docker container:
```bash
# Build the image
docker build -t gemini-docs-mcp .

# Run the container
docker run -p 8080:8080 gemini-docs-mcp
```

Cloud Run Deployment
Deploy to Google Cloud Run:
```bash
# Build and deploy
gcloud run deploy gemini-docs-mcp \
  --source . \
  --platform managed \
  --region us-central1 \
  --allow-unauthenticated
```

The server will be accessible at https://<your-service-url>/mcp.
Running in Stdio Mode (Local)
If you don't set the PORT environment variable, the server runs in stdio mode for local MCP clients:
```bash
# Don't set PORT - runs in stdio mode
gemini-docs-mcp
```

Configuration
The database is stored at:

- /tmp/gemini-api-docs/database.db in containerized environments
- ~/.mcp/gemini-api-docs/database.db in local environments
You can override this by setting the GEMINI_DOCS_DB_PATH environment variable.
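The resolution order could be sketched as follows; the container-detection heuristic here is an assumption for illustration, not the server's documented logic:

```python
import os
from pathlib import Path

def database_path(env=None):
    """Resolve the docs database path (assumed precedence: override, container, home)."""
    env = os.environ if env is None else env
    # Explicit override always wins.
    override = env.get("GEMINI_DOCS_DB_PATH")
    if override:
        return Path(override)
    # Assumed heuristic: containerized deployments set PORT in this setup.
    if env.get("PORT"):
        return Path("/tmp/gemini-api-docs/database.db")
    return Path.home() / ".mcp" / "gemini-api-docs" / "database.db"

print(database_path({"GEMINI_DOCS_DB_PATH": "/data/docs.db"}))  # /data/docs.db
```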
Using with an MCP Client
For remote HTTP servers, configure your MCP client to connect via HTTP:
```json
{
  "mcpServers": {
    "gemini-docs": {
      "url": "https://<your-service-url>/mcp"
    }
  }
}
```

For local development with stdio (if supported by your client):
```json
{
  "mcpServers": {
    "gemini-docs": {
      "command": "gemini-docs-mcp"
    }
  }
}
```

Tools
- search_documentation(queries: list[str]): Performs a full-text search on Gemini documentation for the given list of queries (max 3).
- get_capability_page(capability: str = None): Without an argument, lists all available capability pages; with a page title, returns that page's content.
- get_current_model(): Gets documentation for current Gemini models.
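Over the HTTP transport, an MCP client invokes these tools with a JSON-RPC 2.0 tools/call request. A minimal payload builder, with an illustrative request id and arguments:

```python
import json

def tools_call(name, arguments, request_id=1):
    """Build an MCP tools/call JSON-RPC 2.0 request body."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Search with up to three short keyword queries, as the tool requires.
body = tools_call("search_documentation", {"queries": ["embeddings"]})
print(json.dumps(body))
```

A client would POST this body to the server's /mcp endpoint; real clients also perform the MCP initialize handshake first.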
License
MIT
Test Results
We run a comprehensive evaluation harness to ensure the MCP server provides accurate and up-to-date code examples. The tests cover both Python and TypeScript SDKs.
| Metric      | Value |
|-------------|-------|
| Total Tests | 117   |
| Passed      | 114   |
| Failed      | 3     |
Last updated: 2025-11-03 13:29:01
You can find the detailed test results in tests/result.json.
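The numbers above correspond to a pass rate of roughly 97.4%; a trivial helper to compute it:

```python
def pass_rate(passed, failed):
    """Percentage of passing tests, rounded to one decimal place."""
    total = passed + failed
    return round(100 * passed / total, 1)

print(pass_rate(114, 3))  # 97.4
```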