# ask_model
Query an AI model with a prompt and receive its response along with metadata. Configure model behavior, sampling temperature, and response format to tailor the output.
## Instructions
Query any AI model with a prompt. Returns the model's response with metadata.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | Model ID to query (e.g. 'gpt-4o', 'gemini-2.5-pro') | |
| prompt | Yes | The prompt to send to the model | |
| system_prompt | No | Optional system prompt to set model behavior | |
| temperature | No | Sampling temperature (0 = deterministic, 2 = creative) | |
| max_tokens | No | Maximum tokens in the response | 1024 |
| format | No | Response format — 'brief' for token-efficient summary, 'detailed' for full response | detailed |
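As an illustration, the sketch below assembles an arguments payload that matches the schema above before handing it to whatever tool-calling client is in use. The helper name `build_ask_model_args` and the `client.call_tool(...)` call are hypothetical placeholders, not part of this tool's API; the 0–2 temperature bound and the 'brief'/'detailed' format values come from the table above.

```python
from typing import Any, Optional


def build_ask_model_args(
    model: str,
    prompt: str,
    system_prompt: Optional[str] = None,
    temperature: Optional[float] = None,
    max_tokens: int = 1024,
    format: str = "detailed",
) -> dict[str, Any]:
    """Assemble an arguments dict matching the ask_model input schema."""
    if format not in ("brief", "detailed"):
        raise ValueError("format must be 'brief' or 'detailed'")

    args: dict[str, Any] = {
        "model": model,          # required, e.g. 'gpt-4o' or 'gemini-2.5-pro'
        "prompt": prompt,        # required
        "max_tokens": max_tokens,
        "format": format,
    }
    # Optional fields are only included when explicitly set.
    if system_prompt is not None:
        args["system_prompt"] = system_prompt
    if temperature is not None:
        if not 0 <= temperature <= 2:
            raise ValueError("temperature must be between 0 and 2")
        args["temperature"] = temperature
    return args


# Example payload; the client call is a placeholder for your own integration.
args = build_ask_model_args(
    model="gpt-4o",
    prompt="Summarize the trade-offs between REST and gRPC.",
    temperature=0.2,
    format="brief",
)
# result = client.call_tool("ask_model", args)  # hypothetical client
```

Keeping the required fields (`model`, `prompt`) as positional parameters and the optional ones as keyword arguments mirrors the Required column of the schema, so invalid payloads fail before the request is sent.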