Gemini MCP Server

by lucky-dersan
# Gemini MCP Server (in Python)

Model Context Protocol (MCP) server for Gemini integration, implemented in Python on FastMCP.

## Quick Start

1. Build the Docker image:

```bash
docker build -t gemini-mcp-server .
```

## Integration with Cursor/Claude

In MCP Settings -> Add MCP server, add this config:

```json
{
  "mcpServers": {
    "gemini": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "--network", "host",
        "-e", "GEMINI_API_KEY",
        "-e", "GEMINI_MODEL",
        "-e", "GEMINI_BASE_URL",
        "-e", "HTTP_PROXY",
        "-e", "HTTPS_PROXY",
        "gemini-mcp-server:latest"
      ],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "GEMINI_MODEL": "gemini-2.5-flash",
        "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai/",
        "HTTP_PROXY": "http://127.0.0.1:17890",
        "HTTPS_PROXY": "http://127.0.0.1:17890"
      }
    }
  }
}
```

Note: don't forget to replace the `GEMINI_API_KEY`, `GEMINI_MODEL`, `GEMINI_BASE_URL`, `HTTP_PROXY`, and `HTTPS_PROXY` values with your actual Gemini credentials and instance URL.
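A note on how the config fits together: in the `args` list, each `-e NAME` flag is passed *without* a value, which tells `docker run` to forward that variable from the launching process's environment. That is why the MCP client's `env` block and the `-e` flags list the same five names: the client sets the variables, and Docker passes them through into the container. A minimal Python sketch (variable names taken from the config above) showing how the same command line is assembled:

```python
# Sketch: build the `docker run` invocation used in the MCP config above.
# Each "-e NAME" (no "=value") forwards NAME from the environment that the
# MCP client populates via its "env" block.
env = {
    "GEMINI_API_KEY": "your_api_key_here",
    "GEMINI_MODEL": "gemini-2.5-flash",
    "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai/",
    "HTTP_PROXY": "http://127.0.0.1:17890",
    "HTTPS_PROXY": "http://127.0.0.1:17890",
}

args = ["run", "--rm", "-i", "--network", "host"]
for name in env:                     # forward each variable by name only
    args += ["-e", name]
args.append("gemini-mcp-server:latest")

command = " ".join(["docker"] + args)
print(command)
```

The `--network host` and the two proxy variables matter together: with host networking, a proxy listening on `127.0.0.1` of the host machine is reachable from inside the container, which would not be the case with Docker's default bridge network.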
