# consult_ai
Consult AI models via OpenRouter for tasks such as coding, analysis, or general questions. The tool auto-selects the best model for the task, or you can specify one or more models for sequential consultation, with conversation history supported across calls.
## Instructions
Consult an AI model via OpenRouter. You can either specify a model or let the system auto-select one based on your task. For sequential multi-model consultation, list the models to query in the 'models' parameter.
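As a rough sketch of how these instructions translate into a call, the snippet below builds two example argument payloads that conform to the input schema documented below. The prompt text, task hint, and pinned model are invented for illustration; only the field names and enum values come from the schema.

```python
# Hypothetical example: building the "arguments" payload for a consult_ai call.
# Field names and enum values come from the input schema below; the prompt text,
# task hint, and pinned model are invented for illustration.
import json

# Simplest case: only the required field, letting the server auto-select a model,
# optionally guided by task_description.
auto_select_args = {
    "prompt": "Review this function for off-by-one errors: ...",
    "task_description": "coding task",  # optional hint for auto-selection
}

# Pinning a specific model instead of relying on auto-selection.
pinned_model_args = {
    "prompt": "Summarize the trade-offs between these two cache designs.",
    "model": "gemini-2.5-pro",  # must be one of the enum values in the schema
}

print(json.dumps(auto_select_args, indent=2))
print(json.dumps(pinned_model_args, indent=2))
```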
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| clear_history | No | Optional: Set to true to clear the conversation history for the given conversation_id before processing this request. | |
| conversation_id | No | Optional: Conversation ID to maintain context across multiple consultations. Use the same ID for follow-up questions. | |
| model | No | Optional: Specific model to use (e.g., 'gemini-2.5-pro', 'gpt-5-codex', 'grok-code-fast-1'). If not specified, the best model will be automatically selected based on the task. | |
| models | No | Optional: Array of models to consult sequentially (e.g., ["gemini-2.5-pro", "gpt-5-codex"]). When specified, the prompt will be sent to each model in order and responses will be aggregated. This parameter takes precedence over 'model'. | |
| prompt | Yes | The question or task to send to the AI model | |
| task_description | No | Optional: Brief description of the task type to help auto-select the best model (e.g., 'coding task', 'complex analysis', 'quick question') | |
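The sketch below shows how the parameters above might combine across a short session: a sequential multi-model consultation, a follow-up that reuses the same conversation_id, and a reset with clear_history. The conversation ID and prompts are invented; the parameter names and the precedence of 'models' over 'model' are taken from the table.

```python
# Hypothetical example: a sequential multi-model consultation followed by a
# follow-up turn and a history reset. The conversation_id value and prompts are
# invented; the parameter names, the precedence of "models" over "model", and
# the clear_history behaviour come from the parameter table above.
import json

# First turn: send the same prompt to two models in order; responses are aggregated.
first_turn_args = {
    "prompt": "Compare approaches A and B for caching session data.",
    "models": ["gemini-2.5-pro", "gpt-5-codex"],  # overrides "model" if both are set
    "conversation_id": "cache-design-review",     # any stable ID chosen by the caller
}

# Follow-up turn: reuse the same conversation_id so earlier context is available.
follow_up_args = {
    "prompt": "Given your earlier answer, which approach degrades better under load?",
    "conversation_id": "cache-design-review",
}

# Starting over: clear the stored history for this conversation_id before processing.
reset_args = {
    "prompt": "Let's restart: evaluate approach C instead.",
    "conversation_id": "cache-design-review",
    "clear_history": True,
}

for args in (first_turn_args, follow_up_args, reset_args):
    print(json.dumps(args, indent=2))
```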
## Input Schema (JSON Schema)

```json
{
"properties": {
"clear_history": {
"description": "Optional: Set to true to clear the conversation history for the given conversation_id before processing this request.",
"type": "boolean"
},
"conversation_id": {
"description": "Optional: Conversation ID to maintain context across multiple consultations. Use the same ID for follow-up questions.",
"type": "string"
},
"model": {
"description": "Optional: Specific model to use (e.g., 'gemini-2.5-pro', 'gpt-5-codex', 'grok-code-fast-1'). If not specified, the best model will be automatically selected based on the task.",
"enum": [
"gemini-2.5-pro",
"gpt-5-codex",
"grok-code-fast-1"
],
"type": "string"
},
"models": {
"description": "Optional: Array of models to consult sequentially (e.g., [\"gemini-2.5-pro\", \"gpt-5-codex\"]). When specified, the prompt will be sent to each model in order and responses will be aggregated. This parameter takes precedence over 'model'.",
"items": {
"enum": [
"gemini-2.5-pro",
"gpt-5-codex",
"grok-code-fast-1"
],
"type": "string"
},
"type": "array"
},
"prompt": {
"description": "The question or task to send to the AI model",
"type": "string"
},
"task_description": {
"description": "Optional: Brief description of the task type to help auto-select the best model (e.g., 'coding task', 'complex analysis', 'quick question')",
"type": "string"
}
},
"required": [
"prompt"
],
"type": "object"
}
```
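For completeness, the following sketch checks example payloads against a trimmed copy of the schema above, assuming the third-party jsonschema package is installed. The description fields are omitted for brevity; they do not affect validation.

```python
# Validate example argument payloads against a trimmed copy of the schema above
# (description fields omitted; they do not affect validation). Assumes the
# third-party "jsonschema" package is installed.
from jsonschema import ValidationError, validate

MODEL_ENUM = ["gemini-2.5-pro", "gpt-5-codex", "grok-code-fast-1"]

schema = {
    "type": "object",
    "required": ["prompt"],
    "properties": {
        "prompt": {"type": "string"},
        "model": {"type": "string", "enum": MODEL_ENUM},
        "models": {"type": "array", "items": {"type": "string", "enum": MODEL_ENUM}},
        "conversation_id": {"type": "string"},
        "clear_history": {"type": "boolean"},
        "task_description": {"type": "string"},
    },
}

# Passes: only the required field is present.
validate(instance={"prompt": "What does this regex match?"}, schema=schema)

# Fails: the required "prompt" field is missing.
try:
    validate(instance={"model": "gpt-5-codex"}, schema=schema)
except ValidationError as exc:
    print(exc.message)
```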