Pollinations Multimodal MCP Server

generateText

Generate text responses from prompts using AI models, with options for reproducible results, system behavior settings, and output formats.

Instructions

Generate text from a prompt using the Pollinations Text API

Input Schema

Name    | Required | Description                                 | Default
prompt  | Yes      | The text prompt to generate a response for  |
model   | No       | Model to use for text generation            | "openai"
options | No       | Additional options for text generation      |
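
A hypothetical call that satisfies this schema (the values below are illustrative, not taken from the project):

    {
      "prompt": "Write a haiku about the ocean",
      "model": "openai",
      "options": {
        "seed": 42,
        "systemPrompt": "You are a concise poet.",
        "json": false,
        "isPrivate": true
      }
    }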

Implementation Reference

  • The handler function that implements the generateText tool logic, fetching generated text from the Pollinations Text API based on the provided prompt and options. A sketch of the helper functions it calls appears after this list.
    async function generateText(params) {
      const { prompt, model = "openai", options = {} } = params;

      if (!prompt || typeof prompt !== "string") {
        throw new Error("Prompt is required and must be a string");
      }

      const { seed, systemPrompt, json, isPrivate } = options;

      // Prepare query parameters
      const queryParams = {
        model,
        seed,
        ...(systemPrompt && { system: encodeURIComponent(systemPrompt) }),
        ...(json && { json: "true" }),
        ...(isPrivate && { private: "true" }),
      };

      // Construct the URL
      const encodedPrompt = encodeURIComponent(prompt);
      const url = buildUrl(TEXT_API_BASE_URL, encodedPrompt, queryParams);

      try {
        // Fetch the text from the URL
        const response = await fetch(url);

        if (!response.ok) {
          throw new Error(`Failed to generate text: ${response.statusText}`);
        }

        // Get the text response
        const textResponse = await response.text();

        // Return the response in MCP format
        return createMCPResponse([createTextContent(textResponse)]);
      } catch (error) {
        console.error("Error generating text:", error);
        throw error;
      }
    }
  • Zod schema defining the input parameters for the generateText tool, including prompt, model, and options.
    prompt: z
      .string()
      .describe("The text prompt to generate a response for"),
    model: z
      .string()
      .optional()
      .describe('Model to use for text generation (default: "openai")'),
    options: z
      .object({
        seed: z
          .number()
          .optional()
          .describe("Seed for reproducible results"),
        systemPrompt: z
          .string()
          .optional()
          .describe("Optional system prompt to set the behavior of the AI"),
        json: z
          .boolean()
          .optional()
          .describe("Set to true to receive response in JSON format"),
        isPrivate: z
          .boolean()
          .optional()
          .describe("Set to true to prevent the response from appearing in the public feed"),
      })
      .optional()
      .describe("Additional options for text generation"),
  • The registration entry for the generateText tool in the textTools array, including name, description, schema, and handler reference. This array is spread into toolDefinitions and registered via server.tool() in src/index.js; a sketch of that wiring appears after this list.
    [
      "generateText",
      "Generate text from a prompt using the Pollinations Text API",
      {
        prompt: z
          .string()
          .describe("The text prompt to generate a response for"),
        model: z
          .string()
          .optional()
          .describe('Model to use for text generation (default: "openai")'),
        options: z
          .object({
            seed: z
              .number()
              .optional()
              .describe("Seed for reproducible results"),
            systemPrompt: z
              .string()
              .optional()
              .describe("Optional system prompt to set the behavior of the AI"),
            json: z
              .boolean()
              .optional()
              .describe("Set to true to receive response in JSON format"),
            isPrivate: z
              .boolean()
              .optional()
              .describe("Set to true to prevent the response from appearing in the public feed"),
          })
          .optional()
          .describe("Additional options for text generation"),
      },
      generateText,
    ],

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/tusharpatil2912/pollinations-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.