Pollinations Multimodal MCP Server

generateText

Generate customizable text responses from prompts using the Pollinations Text API. Specify a model, set system behavior with a system prompt, and optionally receive output in JSON format for easier integration.

Instructions

Generate text from a prompt using the Pollinations Text API

Input Schema

| Name    | Required | Description                                | Default  |
|---------|----------|--------------------------------------------|----------|
| model   | No       | Model to use for text generation           | "openai" |
| options | No       | Additional options for text generation     |          |
| prompt  | Yes      | The text prompt to generate a response for |          |

Input Schema (JSON Schema)

{ "$schema": "http://json-schema.org/draft-07/schema#", "additionalProperties": false, "properties": { "model": { "description": "Model to use for text generation (default: \"openai\")", "type": "string" }, "options": { "additionalProperties": false, "description": "Additional options for text generation", "properties": { "isPrivate": { "description": "Set to true to prevent the response from appearing in the public feed", "type": "boolean" }, "json": { "description": "Set to true to receive response in JSON format", "type": "boolean" }, "seed": { "description": "Seed for reproducible results", "type": "number" }, "systemPrompt": { "description": "Optional system prompt to set the behavior of the AI", "type": "string" } }, "type": "object" }, "prompt": { "description": "The text prompt to generate a response for", "type": "string" } }, "required": [ "prompt" ], "type": "object" }

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/tusharpatil2912/pollinations-mcp'
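For example, to fetch this server's record and pretty-print it from the command line (a minimal sketch assuming jq is installed; the response fields are not documented on this page):

curl -s -X GET 'https://glama.ai/api/mcp/v1/servers/tusharpatil2912/pollinations-mcp' | jq '.'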

If you have feedback or need assistance with the MCP directory API, please join our Discord server.