Provides integration with Google's Gemini AI models, enabling access to models like gemini-2.5-flash through Google's generative language API.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type "@" followed by the MCP server name and your instructions, e.g., "@Gemini MCP Server summarize this article about quantum computing".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Gemini MCP Server (in Python)
Model Context Protocol (MCP) server for Gemini integration, implemented in Python and built on FastMCP.
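The README only names the stack, so here is a minimal sketch of what a FastMCP-based Gemini server might look like. The tool name, model default, and environment-variable handling are assumptions for illustration, not the actual implementation:

```python
import os

from fastmcp import FastMCP
import google.generativeai as genai  # Gemini client library (assumed here)

# Configure the Gemini client from the environment
# (variable names taken from the configuration note below)
genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel(os.environ.get("GEMINI_MODEL", "gemini-2.5-flash"))

mcp = FastMCP("Gemini MCP Server")

@mcp.tool()
def ask_gemini(prompt: str) -> str:
    """Send a prompt to Gemini and return the text of its response."""
    return model.generate_content(prompt).text

if __name__ == "__main__":
    mcp.run()
```

With a server like this, any MCP client (Cursor, Claude Desktop, etc.) can discover and call the `ask_gemini` tool once the server is registered in its config.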
Quick Start
Build the Docker image:
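A command along these lines should build and run the image, assuming the Dockerfile sits at the repository root; the image tag is a placeholder:

```shell
# Build the server image (tag is illustrative)
docker build -t gemini-mcp-server .

# Run it, passing the Gemini API key through the environment
docker run -i --rm -e GEMINI_API_KEY=your-api-key gemini-mcp-server
```

The `-i` flag keeps stdin open, which MCP clients typically use to talk to the server over stdio.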
Integration with Cursor/Claude
In MCP Settings -> Add MCP server, add this config:
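One possible shape for that entry, assuming the server runs via Docker and that your client uses the common `mcpServers` config format (key names and the image tag are assumptions; adjust to your client's version):

```json
{
  "mcpServers": {
    "gemini-mcp-server": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "GEMINI_API_KEY", "gemini-mcp-server"],
      "env": {
        "GEMINI_API_KEY": "your-api-key",
        "GEMINI_MODEL": "gemini-2.5-flash",
        "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com"
      }
    }
  }
}
```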
Note: Don't forget to replace the GEMINI_API_KEY, GEMINI_MODEL, GEMINI_BASE_URL, HTTP_PROXY, and HTTPS_PROXY values with your actual Gemini credentials and instance URL.