
OpenRouter MCP Multimodal Server

by hoangdn3

mcp_openrouter_chat_completion

Send messages to OpenRouter.ai models for text chat and image analysis, supporting multimodal conversations with automatic image optimization.

Instructions

Send a message to OpenRouter.ai and get a response

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | No | The model to use (e.g., "google/gemini-2.5-pro-exp-03-25:free", "undi95/toppy-m-7b:free"). If not provided, uses the default model if set. | |
| messages | Yes | An array of conversation messages with roles and content | |
| temperature | No | Sampling temperature (0-2) | |

Input Schema (JSON Schema)

{
  "properties": {
    "messages": {
      "description": "An array of conversation messages with roles and content",
      "items": {
        "properties": {
          "content": {
            "oneOf": [
              {
                "description": "The text content of the message",
                "type": "string"
              },
              {
                "description": "Array of content parts for multimodal messages (text and images)",
                "items": {
                  "properties": {
                    "image_url": {
                      "description": "The image URL object (for image_url type)",
                      "properties": {
                        "url": {
                          "description": "URL of the image (can be a data URL with base64)",
                          "type": "string"
                        }
                      },
                      "required": ["url"],
                      "type": "object"
                    },
                    "text": {
                      "description": "The text content (for text type)",
                      "type": "string"
                    },
                    "type": {
                      "description": "The type of content (text or image)",
                      "enum": ["text", "image_url"],
                      "type": "string"
                    }
                  },
                  "required": ["type"],
                  "type": "object"
                },
                "type": "array"
              }
            ]
          },
          "role": {
            "description": "The role of the message sender",
            "enum": ["system", "user", "assistant"],
            "type": "string"
          }
        },
        "required": ["role", "content"],
        "type": "object"
      },
      "maxItems": 100,
      "minItems": 1,
      "type": "array"
    },
    "model": {
      "description": "The model to use (e.g., \"google/gemini-2.5-pro-exp-03-25:free\", \"undi95/toppy-m-7b:free\"). If not provided, uses the default model if set.",
      "type": "string"
    },
    "temperature": {
      "description": "Sampling temperature (0-2)",
      "maximum": 2,
      "minimum": 0,
      "type": "number"
    }
  },
  "required": ["messages"],
  "type": "object"
}
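As a sketch of what a valid set of tool arguments looks like, the snippet below builds a multimodal request matching the schema above and checks the constraints the schema declares. The model name and image data URL are illustrative placeholders, and the validation is a hand-rolled subset of the schema, not a full JSON Schema validator; how the arguments are actually delivered to the tool depends on your MCP client.

```python
# Example arguments for the mcp_openrouter_chat_completion tool,
# shaped to match the input schema above. Model name and image
# data are placeholders.
arguments = {
    "model": "google/gemini-2.5-pro-exp-03-25:free",
    "temperature": 0.7,
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "data:image/png;base64,iVBORw0KGgo"},
                },
            ],
        },
    ],
}

def validate_arguments(args: dict) -> None:
    """Check the constraints the schema declares (a minimal subset,
    not a full JSON Schema validator)."""
    messages = args["messages"]               # "messages" is required
    assert 1 <= len(messages) <= 100          # minItems / maxItems
    for msg in messages:
        assert msg["role"] in ("system", "user", "assistant")
        content = msg["content"]
        if isinstance(content, list):         # multimodal message
            for part in content:
                assert part["type"] in ("text", "image_url")
                if part["type"] == "image_url":
                    assert "url" in part["image_url"]
    if "temperature" in args:
        assert 0 <= args["temperature"] <= 2  # schema minimum / maximum

validate_arguments(arguments)
```

Note that `content` may be either a plain string (text-only message) or an array of typed parts, which is what enables mixing text and images in one message.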

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hoangdn3/mcp-ocr-fallback'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.