
MCP Ollama Consult Server

by Atomic-Germ

compare_ollama_models

Compare outputs from multiple Ollama models by running the same prompt against each of them and returning the responses side by side for evaluation.

Instructions

Run the same prompt against multiple Ollama models and return their outputs side-by-side for comparison.
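
As a rough illustration, the tool can be invoked from any MCP client. The sketch below uses the MCP TypeScript SDK over stdio; the server launch command, model names, prompt, and system prompt are placeholders for illustration, not values documented by this server.

// Minimal sketch: calling compare_ollama_models via the MCP TypeScript SDK.
// The launch command ("npx mcp-consult") is an assumption, not taken from the docs.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the server as a child process and talk to it over stdio.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["mcp-consult"], // hypothetical launch command for the server
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Same prompt sent to several models; the model names are example values.
  const result = await client.callTool({
    name: "compare_ollama_models",
    arguments: {
      prompt: "Explain the difference between TCP and UDP in two sentences.",
      models: ["llama3.2", "mistral"],
      system_prompt: "You are a concise technical writer.",
    },
  });

  console.log(result);
  await client.close();
}

main().catch(console.error);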

Input Schema

Name            Required   Description   Default
models          No
prompt          Yes
system_prompt   No

Input Schema (JSON Schema)

{ "properties": { "models": { "items": { "type": "string" }, "type": "array" }, "prompt": { "type": "string" }, "system_prompt": { "type": "string" } }, "required": [ "prompt" ], "type": "object" }
