
Gemini MCP Server

by riotofgeese

gemini

Run multi-turn conversations with Google's Gemini AI using customizable prompts, access policies, and system instructions for development tasks.

Instructions

Run a Gemini session. Similar to Codex but uses Google Gemini 3 Pro Preview.

Supports configuration parameters matching the Codex Config struct (a sample invocation follows the list):

  • prompt: The initial user prompt to start the conversation (required)

  • cwd: Working directory context

  • sandbox: Access policy ("read-only", "workspace-write", "danger-full-access")

  • base-instructions: Override default system instructions

  • developer-instructions: Additional developer context

  • model: Optional override for model (default: gemini-3-pro-preview)
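
If you call the tool programmatically rather than from an MCP-aware editor, an invocation might look like the sketch below. This is a minimal example using the MCP TypeScript SDK client; the launch command, paths, and prompt are placeholders, and it assumes the tool is registered under the name "gemini".

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch and connect to the Gemini MCP server over stdio.
// The command below is a placeholder -- use however you start the server locally.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "gemini-mcp"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Call the "gemini" tool with a prompt and a read-only access policy.
const result = await client.callTool({
  name: "gemini",
  arguments: {
    prompt: "Summarize the TODO comments in this repository.",
    cwd: "/path/to/project",
    sandbox: "read-only",
    // model is omitted, so the default gemini-3-pro-preview is used
  },
});

console.log(result.content);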

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| prompt | Yes | The initial user prompt to start the Gemini conversation | |
| cwd | No | Working directory for context | |
| sandbox | No | Access policy mode | |
| base-instructions | No | Override the default system instructions | |
| developer-instructions | No | Developer instructions for additional context | |
| model | No | Model override | gemini-3-pro-preview |
| config | No | Additional config settings (passthrough) | |
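
The page does not reproduce the full JSON Schema, so the property types below are assumptions inferred from the descriptions above. A TypeScript view of the argument shape:

// Assumed argument shape for the "gemini" tool; only `prompt` is required.
interface GeminiToolArguments {
  prompt: string;                        // initial user prompt
  cwd?: string;                          // working directory for context
  sandbox?: "read-only" | "workspace-write" | "danger-full-access";
  "base-instructions"?: string;          // override default system instructions
  "developer-instructions"?: string;     // additional developer context
  model?: string;                        // defaults to "gemini-3-pro-preview"
  config?: Record<string, unknown>;      // additional settings, passed through
}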

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/riotofgeese/gemini-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.