compare_models
Run the same prompt, with an optional shared system prompt, against multiple models and see the outputs side by side for evaluation.
Instructions
Run the same prompt against multiple models and return all responses side-by-side for comparison.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | The prompt to send to each model. | |
| model_names | Yes | List of model names to compare. | |
| system_prompt | No | Optional system instruction applied to all models. | None |
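For illustration, a call might look like the following; the model names and prompt are hypothetical and must match models available to the server's Ollama instance:

```json
{
  "prompt": "Summarize the CAP theorem in two sentences.",
  "model_names": ["llama3.2", "mistral"],
  "system_prompt": "Answer concisely."
}
```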
Output Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | The prompt that was sent, echoed back. | |
| results | Yes | Mapping from model name to its response text, or an `ERROR: ...` string if that model's call failed. | |
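A successful call returns a structure shaped like this (the responses shown are hypothetical); note that a failure for one model does not abort the others:

```json
{
  "prompt": "Summarize the CAP theorem in two sentences.",
  "results": {
    "llama3.2": "The CAP theorem states that a distributed system can provide at most two of consistency, availability, and partition tolerance...",
    "mistral": "ERROR: model 'mistral' not found"
  }
}
```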
Implementation Reference
- `src/foundry_reverse/server.py:122-128` (registration): registration of the `compare_models` tool via the `@mcp.tool` decorator with FastMCP.
```python
@mcp.tool(
    name="compare_models",
    description=(
        "Run the same prompt against multiple models and return all responses "
        "side-by-side for comparison."
    ),
)
```

- `src/foundry_reverse/server.py:129-146` (handler): handler function that runs the same prompt against multiple Ollama models and returns responses side by side.
```python
async def compare_models(
    prompt: str,
    model_names: list[str],
    system_prompt: str | None = None,
) -> dict[str, Any]:
    """
    Args:
        prompt: The prompt to send to each model.
        model_names: List of model names to compare.
        system_prompt: Optional system instruction applied to all models.
    """
    results: dict[str, str] = {}
    for name in model_names:
        try:
            results[name] = await oc.generate(model=name, prompt=prompt, system=system_prompt)
        except Exception as exc:  # noqa: BLE001
            results[name] = f"ERROR: {exc}"
    return {"prompt": prompt, "results": results}
```
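As written, the handler queries models one at a time, so total latency is the sum of the individual calls. A concurrent fan-out is a natural extension. The sketch below is not part of the source: it assumes the same `oc.generate` interface and preserves the handler's per-model error isolation.

```python
# Hypothetical concurrent variant of the handler's fan-out loop (not in the source).
import asyncio
from typing import Any

async def compare_models_concurrent(
    oc: Any,  # any client exposing `await oc.generate(model=..., prompt=..., system=...)`
    prompt: str,
    model_names: list[str],
    system_prompt: str | None = None,
) -> dict[str, Any]:
    async def one(name: str) -> tuple[str, str]:
        try:
            return name, await oc.generate(model=name, prompt=prompt, system=system_prompt)
        except Exception as exc:  # same broad catch as the handler above
            return name, f"ERROR: {exc}"

    # Query every model in parallel; one slow or failing model does not block the rest.
    pairs = await asyncio.gather(*(one(n) for n in model_names))
    return {"prompt": prompt, "results": dict(pairs)}
```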
- Helper function that makes the actual Ollama API call for text generation, used by the `compare_models` handler.

```python
async def generate(
    model: str,
    prompt: str,
    system: str | None = None,
    options: dict[str, Any] | None = None,
) -> str:
    payload: dict[str, Any] = {
        "model": model,
        "prompt": prompt,
        "stream": False,
    }
    if system:
        payload["system"] = system
    if options:
        payload["options"] = options
    async with _client() as c:
        r = await c.post("/api/generate", json=payload)
        r.raise_for_status()
        return r.json().get("response", "")
```
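The `_client()` factory itself is not shown in the source. A plausible minimal shape, assuming httpx and Ollama's default local endpoint (the timeout value is hypothetical):

```python
# Hypothetical _client() sketch; the actual implementation is not shown in the source.
import httpx

OLLAMA_BASE_URL = "http://localhost:11434"  # Ollama's default local endpoint

def _client() -> httpx.AsyncClient:
    # A fresh client per call, matching the `async with _client() as c:` usage above.
    return httpx.AsyncClient(base_url=OLLAMA_BASE_URL, timeout=120.0)
```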