
generate_text

Generate coherent and contextually relevant text responses by providing a prompt. Adjust parameters like temperature, max tokens, and top-K to control output creativity and length.
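As a rough illustration of how these parameters fit together, the sketch below calls generate_text from a TypeScript MCP client. It is a minimal sketch, assuming the server is launched over stdio and using the official @modelcontextprotocol/sdk; the launch command, entry-point path, prompt, and parameter values are placeholders rather than part of this server's documentation.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Hypothetical launch command for the Gemini MCP server; adjust to your setup.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Call generate_text with a prompt plus optional sampling controls.
const result = await client.callTool({
  name: "generate_text",
  arguments: {
    prompt: "Write a two-sentence summary of the Model Context Protocol.",
    temperature: 0.7,     // 0–1: higher values give more varied output
    maxOutputTokens: 256, // 1–8192: caps the length of the response
    topK: 40,             // 1–40: sample from the K most likely tokens
    topP: 0.95,           // 0–1: nucleus sampling threshold
  },
});

console.log(result.content);
```

The stream flag from the schema is omitted here; its exact behavior is server-specific.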

Input Schema

Name             Required  Description                 Default
maxOutputTokens  No        Number, 1–8192              -
prompt           Yes       String, minimum length 1    -
stream           No        Boolean                     -
temperature      No        Number, 0–1                 -
topK             No        Number, 1–40                -
topP             No        Number, 0–1                 -

Input Schema (JSON Schema)

{ "$schema": "http://json-schema.org/draft-07/schema#", "additionalProperties": false, "properties": { "maxOutputTokens": { "maximum": 8192, "minimum": 1, "type": "number" }, "prompt": { "minLength": 1, "type": "string" }, "stream": { "type": "boolean" }, "temperature": { "maximum": 1, "minimum": 0, "type": "number" }, "topK": { "maximum": 40, "minimum": 1, "type": "number" }, "topP": { "maximum": 1, "minimum": 0, "type": "number" } }, "required": [ "prompt" ], "type": "object" }
