get_model_info
Obtain detailed information about a specific Ollama model, including configuration and metadata, to support local model management and inference without cloud dependencies.
Instructions
Get detailed information about a specific Ollama model.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model_name | Yes | The name of the model to inspect (e.g. 'llama3', 'mistral:7b'). | — |
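For illustration, the sketch below shows one way a client could call this tool using the MCP Python SDK over stdio. The server launch command and the model name are assumptions, not taken from this repository; the only requirement imposed by the schema is the single `model_name` argument.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumption: the server can be launched as a module; adjust the command
    # to however the foundry_reverse MCP server is actually started.
    params = StdioServerParameters(command="python", args=["-m", "foundry_reverse.server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # 'mistral:7b' is only an example model name.
            result = await session.call_tool("get_model_info", {"model_name": "mistral:7b"})
            print(result.content)


asyncio.run(main())
```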
Output Schema
No structured output schema is defined. The handler returns a `dict[str, Any]` containing the model information reported by Ollama's `/api/show` endpoint, with the `modelfile` field truncated to 500 characters.
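As a rough illustration only, the returned dictionary mirrors Ollama's `/api/show` response. The keys and values below are assumptions and vary by Ollama version and model; only the `modelfile` truncation is guaranteed by the handler shown further down.

```python
# Illustrative shape only; real keys/values depend on the Ollama version and model.
example_info = {
    "modelfile": "# Modelfile generated by ollama\nFROM mistral:7b\n...",  # capped at 500 chars by the handler
    "parameters": 'temperature 0.7\nstop "</s>"',
    "template": "{{ .System }}\n{{ .Prompt }}",
    "details": {
        "family": "llama",
        "parameter_size": "7B",
        "quantization_level": "Q4_0",
    },
}
```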
Implementation Reference
- src/foundry_reverse/server.py:69-83 (handler): The MCP tool handler for 'get_model_info'. Registered with the @mcp.tool decorator, it delegates to oc.get_model_info and truncates the modelfile to 500 characters.
```python
@mcp.tool(
    name="get_model_info",
    description="Get detailed information about a specific Ollama model.",
)
async def get_model_info(model_name: str) -> dict[str, Any]:
    """
    Args:
        model_name: The name of the model (e.g. 'llama3', 'mistral:7b').
    """
    info = await oc.get_model_info(model_name)
    # Trim the modelfile to avoid huge outputs
    modelfile = info.get("modelfile", "")
    if len(modelfile) > 500:
        info["modelfile"] = modelfile[:500] + "..."
    return info
```

- src/foundry_reverse/server.py:69-72 (registration): The @mcp.tool decorator registers 'get_model_info' as an MCP tool with FastMCP.
```python
@mcp.tool(
    name="get_model_info",
    description="Get detailed information about a specific Ollama model.",
)
```

- The underlying async helper function that calls the Ollama API (POST /api/show) to fetch model info.
```python
async def get_model_info(name: str) -> dict[str, Any]:
    async with _client() as c:
        # POST /api/show returns the model's metadata (modelfile, parameters, template, ...)
        r = await c.post("/api/show", json={"name": name})
        r.raise_for_status()
        return r.json()
```
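The `_client()` helper is referenced but not shown in this section. A minimal sketch, assuming it wraps `httpx.AsyncClient` pointed at Ollama's default local endpoint (both assumptions, not confirmed by the source), might look like this:

```python
import httpx

# Assumption: Ollama's default local endpoint; the real base URL may be configurable.
OLLAMA_BASE_URL = "http://localhost:11434"


def _client() -> httpx.AsyncClient:
    # httpx.AsyncClient is an async context manager, so
    # `async with _client() as c:` works as used in get_model_info above.
    return httpx.AsyncClient(base_url=OLLAMA_BASE_URL, timeout=30.0)
```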