# neuroverse_model
Route AI queries to optimal models for multilingual, reasoning, local, or general tasks using OpenAI, Anthropic, Sarvam AI, or Ollama providers.
## Instructions
Query the multi-model AI router. If a prompt is provided, it is sent to the routed model; otherwise only the routing decision is returned. Supported providers: OpenAI, Anthropic, Sarvam AI, Ollama.

Args:

- task_type (string): multilingual | reasoning | local | general
- prompt (string, optional): Prompt to send to the routed model

Returns: JSON with the routing decision and, when a prompt was supplied, the model's response.
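As a concrete illustration, a call that supplies only a task type returns just the routing decision. The exact fields inside `routing` depend on the server's `routeTask` implementation; the ones shown here are illustrative only:

```json
{ "task_type": "reasoning" }
```

might produce a response such as:

```json
{
  "routing": {
    "task_type": "reasoning",
    "provider": "Anthropic"
  }
}
```

Adding a `prompt` field would additionally populate a `model_response` key in the same JSON object.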
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| task_type | No | Task type for routing | general |
| prompt | No | Optional prompt to actually send to the routed model | (none) |
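The defaulting behavior in the table can be sketched with a small normalizer. This is a simplified stand-in for the server's actual zod schema, showing only the `general` fallback, not the enum validation:

```typescript
type TaskType = "multilingual" | "reasoning" | "local" | "general";

interface ModelRouteInput {
  task_type?: TaskType;
  prompt?: string;
}

// Apply the documented default: a missing task_type falls back to "general".
function normalizeInput(input: ModelRouteInput): { task_type: TaskType; prompt?: string } {
  return { task_type: input.task_type ?? "general", prompt: input.prompt };
}
```

So `normalizeInput({})` yields `{ task_type: "general", prompt: undefined }`, matching the table's default column.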
## Implementation Reference
- `npm/src/index.ts:391-409` (handler): the handler implementation for the `neuroverse_model` tool. It routes the task based on `task_type` and optionally calls the LLM.

```typescript
async (params) => {
  const taskType = params.task_type as TaskType;
  const decision = routeTask(taskType);
  const result: Record<string, unknown> = { routing: decision };
  if (params.prompt) {
    try {
      const response = await callLLM(params.prompt, taskType);
      result["model_response"] = response;
    } catch (e) {
      result["model_response"] = `Error: ${e instanceof Error ? e.message : String(e)}`;
    }
  }
  return {
    content: [{ type: "text" as const, text: JSON.stringify(result, null, 2) }],
  };
}
```

- `npm/src/index.ts:353-364` (schema): input validation schema for the `neuroverse_model` tool.
```typescript
const ModelRouteSchema = z
  .object({
    task_type: z
      .enum(["multilingual", "reasoning", "local", "general"])
      .default("general")
      .describe("Task type for routing"),
    prompt: z
      .string()
      .optional()
      .describe("Optional prompt to actually send to the routed model"),
  })
  .strict();
```

- `npm/src/index.ts:366-390` (registration): tool registration for `neuroverse_model` on the server.
```typescript
server.registerTool(
  "neuroverse_model",
  {
    title: "Model Route",
    description: `Query the multi-model AI router.

If a prompt is provided, the prompt is sent to the routed model. Otherwise, returns only the routing decision.

Supported providers: OpenAI, Anthropic, Sarvam AI, Ollama.

Args:
- task_type (string): multilingual | reasoning | local | general
- prompt (string, optional): Prompt to send

Returns: JSON with routing decision and optional model response`,
    inputSchema: ModelRouteSchema,
    annotations: {
      readOnlyHint: true,
      destructiveHint: false,
      idempotentHint: true,
      openWorldHint: true,
    },
  },
  // handler (npm/src/index.ts:391-409) is passed as the final argument
```
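Neither `routeTask` nor `callLLM` appears in the excerpts above. A minimal sketch of what `routeTask` could look like, with entirely illustrative provider/model pairs (the real routing table in `npm/src/index.ts` may differ), is:

```typescript
type TaskType = "multilingual" | "reasoning" | "local" | "general";

interface RoutingDecision {
  provider: string;
  model: string;
  reason: string;
}

// Hypothetical routing table: the providers come from the tool description,
// but the specific model names here are placeholders.
const ROUTES: Record<TaskType, RoutingDecision> = {
  multilingual: { provider: "Sarvam AI", model: "sarvam-m", reason: "multilingual coverage" },
  reasoning: { provider: "Anthropic", model: "claude-sonnet", reason: "long-form reasoning" },
  local: { provider: "Ollama", model: "llama3", reason: "runs locally, no API key" },
  general: { provider: "OpenAI", model: "gpt-4o-mini", reason: "good default" },
};

function routeTask(taskType: TaskType): RoutingDecision {
  // Unknown values fall back to the "general" route, mirroring the schema default.
  return ROUTES[taskType] ?? ROUTES.general;
}
```

With this shape, `routeTask("local")` returns the Ollama decision, which the handler then embeds under the `routing` key of its JSON result.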