
ask_model

Query AI models with prompts to get responses and metadata. Configure model behavior, temperature, and response format for customized outputs.

Instructions

Query any AI model with a prompt. Returns the model's response with metadata.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `model` | Yes | Model ID to query (e.g. `gpt-4o`, `gemini-2.5-pro`) | |
| `prompt` | Yes | The prompt to send to the model | |
| `system_prompt` | No | Optional system prompt to set model behavior | |
| `temperature` | No | Sampling temperature (0 = deterministic, 2 = creative) | |
| `max_tokens` | No | Maximum number of tokens in the response | `1024` |
| `format` | No | Response format: `brief` for a token-efficient summary, `detailed` for the full response | `detailed` |
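As a concrete illustration, an arguments object matching the schema above might look like the sketch below. The parameter names come from the schema; the specific model ID and values are illustrative assumptions, not requirements of the tool.

```python
# Hypothetical arguments payload for an ask_model tool call.
# Only `model` and `prompt` are required; the rest override defaults.
arguments = {
    "model": "gpt-4o",                                # any supported model ID
    "prompt": "Summarize MCP in one sentence.",
    "system_prompt": "You are a concise assistant.",  # optional behavior setup
    "temperature": 0.2,                               # 0 = deterministic, 2 = creative
    "max_tokens": 256,                                # defaults to 1024 if omitted
    "format": "brief",                                # defaults to 'detailed'
}
```

Omitting `max_tokens` and `format` falls back to the defaults listed in the table.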

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Pickle-Pixel/HydraMCP'
```
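The same endpoint can be addressed from code. This is a minimal sketch that only builds the server-entry URL from the pattern in the curl example above; the `server_url` helper is a hypothetical name, not part of any Glama client library.

```python
from urllib.parse import urljoin

# Base endpoint of the MCP directory API (taken from the curl example).
BASE = "https://glama.ai/api/mcp/v1/"

def server_url(owner: str, repo: str) -> str:
    """Build the URL for a directory entry such as Pickle-Pixel/HydraMCP."""
    return urljoin(BASE, f"servers/{owner}/{repo}")

print(server_url("Pickle-Pixel", "HydraMCP"))
# https://glama.ai/api/mcp/v1/servers/Pickle-Pixel/HydraMCP
```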

If you have feedback or need assistance with the MCP directory API, please join our Discord server.