Gemini Context MCP Server

by ogoldberg
.env.example (471 B)
# Required settings
GEMINI_API_KEY=your-api-key-here
GEMINI_MODEL=gemini-2.0-flash # or other model variants like gemini-pro-vision

# Optional model settings
GEMINI_TEMPERATURE=0.7
GEMINI_TOP_K=40
GEMINI_TOP_P=0.9
GEMINI_MAX_OUTPUT_TOKENS=2097152

# Optional server settings
MAX_SESSIONS=50
SESSION_TIMEOUT_MINUTES=120
MAX_MESSAGE_LENGTH=1000000
MAX_TOKENS_PER_SESSION=2097152
DEBUG=false

# Server configuration
NODE_ENV=development # development, test, or production
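As a rough sketch of how a Node-based server might consume these settings, the snippet below loads the variables with dotenv and falls back to the defaults shown above. The config object and its field names are assumptions for illustration; the repository's actual configuration module may be structured differently.

import 'dotenv/config'; // loads .env into process.env

// Hypothetical config loader mirroring the variables in .env.example
const config = {
  apiKey: process.env.GEMINI_API_KEY ?? '',
  model: process.env.GEMINI_MODEL ?? 'gemini-2.0-flash',
  temperature: Number(process.env.GEMINI_TEMPERATURE ?? 0.7),
  topK: Number(process.env.GEMINI_TOP_K ?? 40),
  topP: Number(process.env.GEMINI_TOP_P ?? 0.9),
  maxOutputTokens: Number(process.env.GEMINI_MAX_OUTPUT_TOKENS ?? 2_097_152),
  maxSessions: Number(process.env.MAX_SESSIONS ?? 50),
  sessionTimeoutMinutes: Number(process.env.SESSION_TIMEOUT_MINUTES ?? 120),
  maxMessageLength: Number(process.env.MAX_MESSAGE_LENGTH ?? 1_000_000),
  maxTokensPerSession: Number(process.env.MAX_TOKENS_PER_SESSION ?? 2_097_152),
  debug: process.env.DEBUG === 'true',
  nodeEnv: process.env.NODE_ENV ?? 'development',
};

// GEMINI_API_KEY is listed as required, so fail fast if it is missing
if (!config.apiKey) {
  throw new Error('GEMINI_API_KEY is required');
}

export default config;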

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ogoldberg/gemini-context-mcp-server'
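The same request can be made from Node/TypeScript with the built-in fetch API (Node 18+). The response schema is not documented here, so it is typed loosely in this sketch.

// Fetch the server's entry from the MCP directory API
const res = await fetch(
  'https://glama.ai/api/mcp/v1/servers/ogoldberg/gemini-context-mcp-server'
);
if (!res.ok) throw new Error(`Request failed: ${res.status}`);
const server: unknown = await res.json(); // response shape not documented here
console.log(server);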

If you have feedback or need assistance with the MCP directory API, please join our Discord server.