generate_text

Generate text with Google's Gemini 2.5 Pro model by providing a prompt. This tool produces written responses for applications such as content creation, brainstorming, and information synthesis.

Instructions

Generate text using Gemini 2.5 Pro model

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| maxTokens | No | Maximum number of tokens to generate (optional) | 1000 |
| prompt | Yes | The text prompt to send to Gemini | |
| temperature | No | Temperature for text generation (0.0 to 2.0) | 1.0 |
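For illustration, a client invokes this tool with an MCP `tools/call` request whose `arguments` follow the schema above; the prompt text and sampling values here are placeholders:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_text",
    "arguments": {
      "prompt": "Write a haiku about the sea",
      "temperature": 0.7,
      "maxTokens": 256
    }
  }
}
```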

Implementation Reference

  • The main handler function that executes the 'generate_text' tool. It takes arguments like prompt and optional temperature/maxTokens, calls the Gemini model to generate content, and returns the generated text.

```typescript
private async handleTextGeneration(args: any) {
  const { prompt, temperature = 1.0 } = args;

  const generationConfig = {
    temperature: Math.max(0, Math.min(2, temperature)),
    maxOutputTokens: args.maxTokens || 1000,
  };

  const result = await this.model.generateContent({
    contents: [{ role: "user", parts: [{ text: prompt }] }],
    generationConfig,
  });

  const response = result.response;
  const text = response.text();

  return {
    content: [
      {
        type: "text",
        text: text,
      },
    ],
  };
}
```
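As a standalone sketch (not part of the server source), the handler's argument handling can be isolated to show how out-of-range values behave: temperature defaults to 1.0 and is clamped to [0, 2], and maxTokens falls back to 1000. The helper name `normalizeArgs` is hypothetical.

```typescript
// Hypothetical helper mirroring the handler's argument normalization.
// Note one quirk: `args.maxTokens || 1000` also replaces an explicit 0,
// because 0 is falsy in JavaScript.
function normalizeArgs(args: { prompt: string; temperature?: number; maxTokens?: number }) {
  const { temperature = 1.0 } = args;
  return {
    temperature: Math.max(0, Math.min(2, temperature)),
    maxOutputTokens: args.maxTokens || 1000,
  };
}

console.log(normalizeArgs({ prompt: "hi", temperature: 5 }));
// temperature 5 exceeds the documented maximum of 2, so it is clamped to 2
```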
  • Input schema definition for the 'generate_text' tool, specifying prompt as required and optional maxTokens and temperature parameters.

```typescript
inputSchema: {
  type: "object",
  properties: {
    prompt: {
      type: "string",
      description: "The text prompt to send to Gemini",
    },
    maxTokens: {
      type: "number",
      description: "Maximum number of tokens to generate (optional)",
      default: 1000,
    },
    temperature: {
      type: "number",
      description: "Temperature for text generation (0.0 to 2.0)",
      default: 1.0,
    },
  },
  required: ["prompt"],
},
```
  • src/index.ts:49-72 (registration)

    Registration of the 'generate_text' tool in the listTools response, including name, description, and input schema.

```typescript
{
  name: "generate_text",
  description: "Generate text using Gemini 2.5 Pro model",
  inputSchema: {
    type: "object",
    properties: {
      prompt: { type: "string", description: "The text prompt to send to Gemini" },
      maxTokens: { type: "number", description: "Maximum number of tokens to generate (optional)", default: 1000 },
      temperature: { type: "number", description: "Temperature for text generation (0.0 to 2.0)", default: 1.0 },
    },
    required: ["prompt"],
  },
},
```
  • src/index.ts:102-103 (registration)

    Switch case that dispatches 'generate_text' tool calls to the handleTextGeneration handler.

```typescript
case "generate_text":
  return await this.handleTextGeneration(args);
```
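Putting the pieces together: once dispatched, the handler resolves the `tools/call` request with the generated text wrapped in MCP's standard content array, along these lines (illustrative output):

```json
{
  "content": [
    {
      "type": "text",
      "text": "...generated text from Gemini..."
    }
  ]
}
```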


MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/lutic1/Google-MCP-Server-'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.