
AI Consultant MCP Server

by filipkrayem

consult_ai

Consult AI models via OpenRouter for tasks like coding, analysis, or questions. Auto-selects the best model or specify multiple models for sequential consultation with conversation history support.

Instructions

Consult with an AI model via OpenRouter. You can either specify a model or let the system auto-select based on your task. For sequential multi-model consultation, use the 'models' parameter to specify multiple models.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| clear_history | No | Set to true to clear the conversation history for the given conversation_id before processing this request. | |
| conversation_id | No | Conversation ID to maintain context across multiple consultations; use the same ID for follow-up questions. | |
| model | No | Specific model to use (e.g., 'gemini-2.5-pro', 'gpt-5-codex', 'grok-code-fast-1'). If not specified, the best model is selected automatically based on the task. | |
| models | No | Array of models to consult sequentially (e.g., ["gemini-2.5-pro", "gpt-5-codex"]). The prompt is sent to each model in order and the responses are aggregated. Takes precedence over 'model'. | |
| prompt | Yes | The question or task to send to the AI model. | |
| task_description | No | Brief description of the task type to help auto-select the best model (e.g., 'coding task', 'complex analysis', 'quick question'). | |
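
For example, a sequential two-model consultation that keeps conversation context might use arguments like the following (the prompt text and conversation_id value are illustrative; field names come from the schema below):

```json
{
  "prompt": "Explain the tradeoffs between optimistic and pessimistic locking.",
  "models": ["gemini-2.5-pro", "gpt-5-codex"],
  "conversation_id": "locking-discussion",
  "clear_history": false
}
```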

Input Schema (JSON Schema)

```json
{
  "type": "object",
  "required": ["prompt"],
  "properties": {
    "clear_history": {
      "type": "boolean",
      "description": "Optional: Set to true to clear the conversation history for the given conversation_id before processing this request."
    },
    "conversation_id": {
      "type": "string",
      "description": "Optional: Conversation ID to maintain context across multiple consultations. Use the same ID for follow-up questions."
    },
    "model": {
      "type": "string",
      "enum": ["gemini-2.5-pro", "gpt-5-codex", "grok-code-fast-1"],
      "description": "Optional: Specific model to use (e.g., 'gemini-2.5-pro', 'gpt-5-codex', 'grok-code-fast-1'). If not specified, the best model will be automatically selected based on the task."
    },
    "models": {
      "type": "array",
      "items": {
        "type": "string",
        "enum": ["gemini-2.5-pro", "gpt-5-codex", "grok-code-fast-1"]
      },
      "description": "Optional: Array of models to consult sequentially (e.g., [\"gemini-2.5-pro\", \"gpt-5-codex\"]). When specified, the prompt will be sent to each model in order and responses will be aggregated. This parameter takes precedence over 'model'."
    },
    "prompt": {
      "type": "string",
      "description": "The question or task to send to the AI model"
    },
    "task_description": {
      "type": "string",
      "description": "Optional: Brief description of the task type to help auto-select the best model (e.g., 'coding task', 'complex analysis', 'quick question')"
    }
  }
}
```
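
The `required` and `enum` constraints above can be checked client-side before a request is sent. A minimal sketch in Python (the `validate_args` helper is hypothetical, not part of the server):

```python
# Hypothetical client-side check mirroring the consult_ai JSON Schema above.
ALLOWED_MODELS = {"gemini-2.5-pro", "gpt-5-codex", "grok-code-fast-1"}

def validate_args(args: dict) -> list[str]:
    """Return a list of schema violations (an empty list means the args are valid)."""
    errors = []
    # 'prompt' is the only required field.
    if "prompt" not in args:
        errors.append("'prompt' is required")
    # 'model' must be one of the enum values when present.
    model = args.get("model")
    if model is not None and model not in ALLOWED_MODELS:
        errors.append(f"unknown model: {model!r}")
    # Every entry of 'models' must also come from the same enum.
    for m in args.get("models", []):
        if m not in ALLOWED_MODELS:
            errors.append(f"unknown model in 'models': {m!r}")
    return errors

print(validate_args({"prompt": "Summarize this diff.", "models": ["gpt-5-codex"]}))  # → []
```

Since `models` takes precedence over `model`, a caller that supplies both should expect only the `models` list to be consulted.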

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/filipkrayem/ai-consultant-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.