Gemini MCP Server

by mintmcqueen

chat

Send messages to Google's Gemini AI for text analysis and multimodal tasks. Uploaded files can be included for code review, document analysis, or image processing, and conversation continuity is maintained across calls.

Instructions

Send a message to Gemini (with optional files). Chat with Gemini, optionally including uploaded files for multimodal analysis.

Typical use: 0-2 files for most tasks (code review, document analysis, image description). Scales to 40+ files when needed for comprehensive analysis.

Workflow (see the sketch below):

1. Upload files first using upload_file (single) or upload_multiple_files (multiple).
2. Pass the returned URIs in the fileUris array.
3. Include your text prompt in message.

The server handles file object caching and proper API formatting, and supports conversation continuity via conversationId. Files are passed to Gemini as direct objects (not fileData structures), and missing files are auto-retrieved from the API if they are not cached.

Returns: response text, token usage, and a conversation ID.
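As a sketch of that workflow from the client side: the snippet below assumes the official MCP TypeScript SDK and a stdio launch command for this server (both assumptions, not taken from this page), and calls the chat tool with one previously uploaded file.

```typescript
// Sketch: calling the chat tool from an MCP client via the official
// TypeScript SDK. The launch command and file URI are placeholders,
// not part of this server's documentation.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"], // hypothetical: however you launch gemini-mcp
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Step 1 happens beforehand: upload_file / upload_multiple_files return URIs.
// Steps 2 and 3: pass those URIs in fileUris and the prompt in message.
const result = await client.callTool({
  name: "chat",
  arguments: {
    message: "Review this module for bugs and suggest improvements.",
    fileUris: ["files/abc-123"], // hypothetical URI returned by upload_file
    model: "gemini-2.5-pro",
    temperature: 0.7,
  },
});

console.log(result.content);
```

The fileUris entry is whatever upload_file returned; per the instructions above, the server resolves it to a cached file object (fetching it from the API if needed) before forwarding it to Gemini.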

Input Schema

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| message | Yes | The message to send to Gemini | |
| model | No | The Gemini model to use | gemini-2.5-pro |
| fileUris | No | Array of file URIs from previously uploaded files | |
| temperature | No | Controls randomness in responses (0.0 to 2.0) | 1 |
| maxTokens | No | Maximum tokens in response | 15000 |
| conversationId | No | Optional conversation ID to continue a previous chat | |

Input Schema (JSON Schema)

{ "properties": { "conversationId": { "description": "Optional conversation ID to continue a previous chat", "type": "string" }, "fileUris": { "description": "Array of file URIs from previously uploaded files", "items": { "type": "string" }, "type": "array" }, "maxTokens": { "default": 15000, "description": "Maximum tokens in response", "maximum": 500000, "minimum": 1, "type": "number" }, "message": { "description": "The message to send to Gemini", "type": "string" }, "model": { "default": "gemini-2.5-pro", "description": "The Gemini model to use", "enum": [ "gemini-2.5-pro", "gemini-2.5-flash", "gemini-2.0-flash-exp" ], "type": "string" }, "temperature": { "default": 1, "description": "Controls randomness in responses (0.0 to 2.0)", "maximum": 2, "minimum": 0, "type": "number" } }, "required": [ "message" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers through our MCP directory API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/mintmcqueen/gemini-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.