getSecondOpinion
Query multiple LLM providers from a single tool on the MindBridge MCP Server. Select a provider, choose a model, and tune sampling parameters to get a second opinion on any prompt.
Instructions
Get responses from various LLM providers
Input Schema
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `frequency_penalty` | No | Penalizes tokens in proportion to how often they have already appeared, reducing repetition. | |
| `maxTokens` | No | Maximum number of tokens to generate in the response. | |
| `model` | Yes | The model to use from the selected provider. | |
| `presence_penalty` | No | Penalizes tokens that have appeared at all, encouraging the model to introduce new topics. | |
| `prompt` | Yes | The prompt to send to the model. | |
| `provider` | Yes | The LLM provider to route the request to. | |
| `reasoning_effort` | No | How much reasoning the model should apply, for models that support it. | |
| `stop_sequences` | No | Sequences at which generation stops. | |
| `stream` | No | Whether to stream the response as it is generated. | |
| `systemPrompt` | No | System prompt used to steer the model's behavior. | |
| `temperature` | No | Sampling temperature; higher values produce more varied output. | |
| `top_k` | No | Restricts sampling to the k most likely tokens. | |
| `top_p` | No | Nucleus sampling; restricts sampling to the smallest token set whose cumulative probability exceeds p. | |

Parameter descriptions above follow their standard meanings across LLM APIs; defaults depend on the selected provider.
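A minimal sketch of assembling the tool's arguments before a call. The field names follow the Input Schema table; the `build_arguments` helper, the provider id, and the model name are illustrative assumptions, not part of the MindBridge API, and the actual invocation shape depends on your MCP client.

```python
# Hypothetical helper: assemble getSecondOpinion arguments and
# check the schema's required fields (provider, model, prompt).

REQUIRED = {"provider", "model", "prompt"}

def build_arguments(provider, model, prompt, **optional):
    """Build the arguments dict, dropping unset optionals."""
    args = {"provider": provider, "model": model, "prompt": prompt}
    args.update({k: v for k, v in optional.items() if v is not None})
    missing = REQUIRED - {k for k, v in args.items() if v is not None}
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    return args

args = build_arguments(
    provider="openai",        # assumed provider id
    model="gpt-4o",           # assumed model name
    prompt="Review this SQL query for correctness.",
    systemPrompt="You are a careful database reviewer.",
    temperature=0.2,
    maxTokens=512,
)
```

The resulting dict would be passed as the tool's arguments through whatever MCP client you use; optional parameters left as `None` are simply omitted.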