# Gemini MCP Server (in Python)

A Model Context Protocol (MCP) server for Gemini integration, implemented in Python on FastMCP. It provides access to Google's Gemini models, such as gemini-2.5-flash, through Google's generative language API.
## Quick Start
Build the Docker image:

```shell
docker build -t gemini-mcp-server .
```
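Once the image is built, you can sanity-check it by starting the container directly over stdio. This is a sketch: the environment values below are placeholders, and the flags mirror the client configuration shown later in this README.

```shell
# Start the MCP server over stdio (-i keeps stdin open for the protocol).
# Substitute your own API key; GEMINI_MODEL falls back to whatever default
# the server defines if omitted.
docker run --rm -i \
  --network host \
  -e GEMINI_API_KEY="your_api_key_here" \
  -e GEMINI_MODEL="gemini-2.5-flash" \
  gemini-mcp-server:latest
```

If the server starts correctly, it will wait silently for MCP messages on stdin; press Ctrl+C to stop it.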
## Integration with Cursor/Claude
In MCP Settings -> Add MCP server, add this config:

```json
{
  "mcpServers": {
    "gemini": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "--network", "host",
        "-e", "GEMINI_API_KEY",
        "-e", "GEMINI_MODEL",
        "-e", "GEMINI_BASE_URL",
        "-e", "HTTP_PROXY",
        "-e", "HTTPS_PROXY",
        "gemini-mcp-server:latest"
      ],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "GEMINI_MODEL": "gemini-2.5-flash",
        "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai/",
        "HTTP_PROXY": "http://127.0.0.1:17890",
        "HTTPS_PROXY": "http://127.0.0.1:17890"
      }
    }
  }
}
```
Note: Don't forget to replace the `GEMINI_API_KEY`, `GEMINI_MODEL`, `GEMINI_BASE_URL`, `HTTP_PROXY`, and `HTTPS_PROXY` values with your actual Gemini API key, model name, endpoint URL, and proxy settings.
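As a rough illustration of how the server consumes these variables, the sketch below reads the same environment names with assumed defaults. The `load_config` helper is hypothetical, not part of the actual server code; only the variable names and default values come from the config above.

```python
import os

def load_config() -> dict:
    """Hypothetical helper: resolve server settings from the environment.

    GEMINI_API_KEY is required; the others fall back to the defaults
    shown in the config example above (an assumption for illustration).
    """
    return {
        "api_key": os.environ["GEMINI_API_KEY"],  # raises KeyError if unset
        "model": os.environ.get("GEMINI_MODEL", "gemini-2.5-flash"),
        "base_url": os.environ.get(
            "GEMINI_BASE_URL",
            "https://generativelanguage.googleapis.com/v1beta/openai/",
        ),
    }

# Example: with only the API key set, the defaults are used.
os.environ.setdefault("GEMINI_API_KEY", "your_api_key_here")
cfg = load_config()
print(cfg["model"])
```

Proxy variables (`HTTP_PROXY`, `HTTPS_PROXY`) are normally picked up automatically by the HTTP client, so they are passed through to the container rather than read explicitly.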