
MCP Server Gemini

by gurr-i

generate_text

Generate text using Google's Gemini AI models, with customizable parameters such as temperature and token limits, plus optional features including JSON mode, Google Search grounding, and conversation context.

Instructions

Generate text using Google Gemini with advanced features

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| conversationId | No | ID for maintaining conversation context | |
| grounding | No | Enable Google Search grounding for up-to-date information | false |
| jsonMode | No | Enable JSON mode for structured output | false |
| jsonSchema | No | JSON schema as a string for structured output (when jsonMode is true) | |
| maxTokens | No | Maximum tokens to generate | 2048 |
| model | No | Specific Gemini model to use | gemini-2.5-flash |
| prompt | Yes | The prompt to send to Gemini | |
| safetySettings | No | Safety settings as JSON string for content filtering | |
| systemInstruction | No | System instruction to guide model behavior | |
| temperature | No | Temperature for generation (0-2) | 0.7 |
| topK | No | Top-k sampling parameter | 40 |
| topP | No | Top-p (nucleus) sampling parameter | 0.95 |

Input Schema (JSON Schema)

```json
{
  "properties": {
    "conversationId": {
      "description": "ID for maintaining conversation context",
      "type": "string"
    },
    "grounding": {
      "default": false,
      "description": "Enable Google Search grounding for up-to-date information",
      "type": "boolean"
    },
    "jsonMode": {
      "default": false,
      "description": "Enable JSON mode for structured output",
      "type": "boolean"
    },
    "jsonSchema": {
      "description": "JSON schema as a string for structured output (when jsonMode is true)",
      "type": "string"
    },
    "maxTokens": {
      "default": 2048,
      "description": "Maximum tokens to generate",
      "type": "number"
    },
    "model": {
      "default": "gemini-2.5-flash",
      "description": "Specific Gemini model to use",
      "enum": [
        "gemini-2.5-pro",
        "gemini-2.5-flash",
        "gemini-2.5-flash-lite",
        "gemini-2.0-flash",
        "gemini-2.0-flash-lite",
        "gemini-2.0-pro-experimental",
        "gemini-1.5-pro",
        "gemini-1.5-flash"
      ],
      "type": "string"
    },
    "prompt": {
      "description": "The prompt to send to Gemini",
      "type": "string"
    },
    "safetySettings": {
      "description": "Safety settings as JSON string for content filtering",
      "type": "string"
    },
    "systemInstruction": {
      "description": "System instruction to guide model behavior",
      "type": "string"
    },
    "temperature": {
      "default": 0.7,
      "description": "Temperature for generation (0-2)",
      "maximum": 2,
      "minimum": 0,
      "type": "number"
    },
    "topK": {
      "default": 40,
      "description": "Top-k sampling parameter",
      "type": "number"
    },
    "topP": {
      "default": 0.95,
      "description": "Top-p (nucleus) sampling parameter",
      "type": "number"
    }
  },
  "required": ["prompt"],
  "type": "object"
}
```
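As a sketch of how a client might assemble arguments for this tool, the snippet below builds a `generate_text` arguments dict and sanity-checks it against the constraints stated in the schema above (required `prompt`, temperature in 0-2, model from the enum). The helper function `build_generate_text_args` is hypothetical, not part of the server; it only illustrates the schema's rules.

```python
import json

# Model enum copied from the tool's input schema above.
ALLOWED_MODELS = {
    "gemini-2.5-pro", "gemini-2.5-flash", "gemini-2.5-flash-lite",
    "gemini-2.0-flash", "gemini-2.0-flash-lite", "gemini-2.0-pro-experimental",
    "gemini-1.5-pro", "gemini-1.5-flash",
}

def build_generate_text_args(prompt, **options):
    """Build an arguments dict for generate_text, enforcing the schema's
    required field and value constraints (hypothetical client-side helper)."""
    if not isinstance(prompt, str) or not prompt:
        raise ValueError("prompt is required and must be a non-empty string")
    args = {"prompt": prompt, **options}
    temperature = args.get("temperature")
    if temperature is not None and not (0 <= temperature <= 2):
        raise ValueError("temperature must be between 0 and 2")
    model = args.get("model")
    if model is not None and model not in ALLOWED_MODELS:
        raise ValueError(f"model not in the tool's enum: {model}")
    return args

# Example: a JSON-mode request pinned to the default model.
args = build_generate_text_args(
    "List three moons of Jupiter as JSON",
    model="gemini-2.5-flash",
    jsonMode=True,
    temperature=0.2,
)
print(json.dumps(args))
```

The resulting dict is what an MCP client would pass as the `arguments` field of a `tools/call` request for `generate_text`.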


    MCP directory API

    All information about MCP servers is available via our MCP directory API.

    curl -X GET 'https://glama.ai/api/mcp/v1/servers/gurr-i/mcp-server-gemini-pro'
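    The same GET request can be issued from Python's standard library; the sketch below builds the request for the endpoint shown in the curl example (the `urlopen` call is left commented out so the snippet has no network side effects).

    ```python
    from urllib.request import Request, urlopen

    # Same endpoint as the curl example above.
    url = "https://glama.ai/api/mcp/v1/servers/gurr-i/mcp-server-gemini-pro"
    req = Request(url, method="GET")

    # Uncomment to actually fetch the server metadata:
    # with urlopen(req) as resp:
    #     print(resp.read().decode())

    print(req.get_method(), req.full_url)
    ```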

    If you have feedback or need assistance with the MCP directory API, please join our Discord server.