
Gemini MCP Server (in Python)

Model Context Protocol (MCP) server for Gemini integration, built on FastMCP.

This server is implemented in Python using fastmcp.

Quick Start

  1. Build the Docker image:

docker build -t gemini-mcp-server .
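
Once the image is built, you can smoke-test it outside of an MCP client. This is a hypothetical invocation, not from the project docs; the environment variable names are the ones used in the client configuration in this README, and `your_api_key_here` is a placeholder:

```shell
# Run the server interactively over stdio, passing Gemini settings as env vars.
# GEMINI_MODEL and GEMINI_BASE_URL are optional overrides; omit them to use defaults.
docker run --rm -i \
  -e GEMINI_API_KEY=your_api_key_here \
  -e GEMINI_MODEL=gemini-2.5-flash \
  gemini-mcp-server:latest
```

MCP servers launched this way communicate over stdin/stdout, so the container must stay attached (`-i`) rather than run detached.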


Integration with Cursor/Claude

In MCP Settings -> Add MCP server, add the following configuration:

{
  "mcpServers": {
    "gemini": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "--network", "host",
        "-e", "GEMINI_API_KEY",
        "-e", "GEMINI_MODEL",
        "-e", "GEMINI_BASE_URL",
        "-e", "HTTP_PROXY",
        "-e", "HTTPS_PROXY",
        "gemini-mcp-server:latest"
      ],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "GEMINI_MODEL": "gemini-2.5-flash",
        "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai/",
        "HTTP_PROXY": "http://127.0.0.1:17890",
        "HTTPS_PROXY": "http://127.0.0.1:17890"
      }
    }
  }
}

Note: Don't forget to replace the GEMINI_API_KEY, GEMINI_MODEL, GEMINI_BASE_URL, HTTP_PROXY, and HTTPS_PROXY values with your actual Gemini credentials, model, endpoint, and proxy settings.
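
For reference, a FastMCP-based server typically reads these settings from the environment at startup. A minimal sketch of that pattern (the variable names and defaults come from the config above; `load_gemini_config` is a hypothetical helper, not this server's actual code):

```python
import os

def load_gemini_config() -> dict:
    """Collect Gemini settings from environment variables.

    Hypothetical helper illustrating how the env vars in the config
    above reach the server; defaults mirror that config.
    """
    return {
        "api_key": os.environ.get("GEMINI_API_KEY", ""),
        "model": os.environ.get("GEMINI_MODEL", "gemini-2.5-flash"),
        "base_url": os.environ.get(
            "GEMINI_BASE_URL",
            "https://generativelanguage.googleapis.com/v1beta/openai/",
        ),
    }

# Usage: cfg = load_gemini_config(); a real server would refuse to start
# if cfg["api_key"] is empty.
```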



MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lucky-dersan/gemini-mcp-server'
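The same lookup can be done from Python. A small sketch that only constructs the GET request for this server's directory entry (the endpoint is taken from the curl example above; `build_server_request` is an illustrative helper, not part of any official client):

```python
from urllib.request import Request, urlopen

GLAMA_API = "https://glama.ai/api/mcp/v1/servers"

def build_server_request(owner: str, name: str) -> Request:
    """Build the GET request for one server's MCP directory entry."""
    return Request(f"{GLAMA_API}/{owner}/{name}", method="GET")

# To actually fetch the JSON entry:
# with urlopen(build_server_request("lucky-dersan", "gemini-mcp-server")) as resp:
#     data = resp.read()
```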

If you have feedback or need assistance with the MCP directory API, please join our Discord server.