
# Gemini MCP Server

by lucky-dersan
`curl.sh` (774 B)
```
# Verify your API key by calling the Gemini REST API directly
export GEMINI_API_KEY=your_api_key_here

curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=${GEMINI_API_KEY}" \
  -H 'Content-Type: application/json' \
  -X POST \
  -d '{
    "contents": [
      {
        "parts": [
          { "text": "Explain how AI works in a few words" }
        ]
      }
    ]
  }'
```

```
# Run the MCP server in Docker, passing the model, base URL, and
# proxy settings through the environment
export GEMINI_API_KEY=your_api_key_here
export GEMINI_MODEL=gemini-2.5-flash
export GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai/
export HTTP_PROXY=http://127.0.0.1:17890
export HTTPS_PROXY=http://127.0.0.1:17890

docker run --rm -i --network host \
  -e GEMINI_API_KEY \
  -e GEMINI_MODEL \
  -e GEMINI_BASE_URL \
  -e HTTP_PROXY \
  -e HTTPS_PROXY \
  gemini-mcp-server:latest
```
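The request body in the first block is plain JSON, so a quick local sanity check (a sketch that needs no network access or API key; `python3` is assumed available and used only as a JSON validator) can catch typos before you spend an API call:

```shell
# Build the generateContent payload and verify it parses as JSON.
PAYLOAD='{
  "contents": [
    { "parts": [ { "text": "Explain how AI works in a few words" } ] }
  ]
}'

echo "$PAYLOAD" | python3 -m json.tool > /dev/null \
  && echo "payload OK" \
  || echo "payload invalid"
```

If the payload is well-formed this prints `payload OK`; otherwise `json.tool` reports the parse error on stderr.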

## MCP directory API

We provide all the information about MCP servers via our MCP API.

```
curl -X GET 'https://glama.ai/api/mcp/v1/servers/lucky-dersan/gemini-mcp-server'
```
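The endpoint path follows a `servers/<author>/<server-name>` scheme, so looking up other entries is just a matter of substituting the two path segments. A small helper sketch (the `mcp_server_url` function is a hypothetical convenience, not part of the API; only the URL scheme comes from the example above):

```shell
# Build the Glama MCP directory API URL for a given author/server pair.
mcp_server_url() {
  echo "https://glama.ai/api/mcp/v1/servers/$1/$2"
}

mcp_server_url lucky-dersan gemini-mcp-server
# → https://glama.ai/api/mcp/v1/servers/lucky-dersan/gemini-mcp-server
```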

If you have feedback or need assistance with the MCP directory API, please join our Discord server.