# list_models
Lists all Ollama models available locally on your system, giving a clear view of installed models for selection and management without any cloud dependency.
## Instructions
List all locally available Ollama models.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| No arguments | | | |
## Output Schema
| Name | Required | Description | Default |
|---|---|---|---|
| result | Yes | List of models, each with `name`, `size_gb`, `modified_at`, and `digest` fields. | |
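To make the output schema concrete, here is an illustrative `result` entry. The field names come from the handler implementation; the values themselves are hypothetical.

```python
# Illustrative shape of one entry in `result` (hypothetical values).
example_entry = {
    "name": "llama3:8b",                    # model name reported by Ollama
    "size_gb": 4.34,                        # size in GiB, rounded to 2 places
    "modified_at": "2024-05-01T12:00:00Z",  # last-modified timestamp
    "digest": "sha256:abcde",               # first 12 chars of the digest
}

# The tool returns a list of such entries.
result = [example_entry]
print(sorted(result[0].keys()))
```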
## Implementation Reference
- `src/foundry_reverse/server.py:56-66` (handler): The MCP tool handler for `list_models`. Fetches models via `ollama_client.list_models()` and transforms the results into a simplified format with `name`, `size_gb`, `modified_at`, and `digest` fields.

  ```python
  async def list_models() -> list[dict[str, Any]]:
      models = await oc.list_models()
      return [
          {
              "name": m.get("name"),
              "size_gb": round(m.get("size", 0) / 1_073_741_824, 2),
              "modified_at": m.get("modified_at"),
              "digest": m.get("digest", "")[:12],
          }
          for m in models
      ]
  ```

- `src/foundry_reverse/server.py:52-55` (registration): The `@mcp.tool` decorator registering `list_models` as an MCP tool with FastMCP, including the description "List all locally available Ollama models."

  ```python
  @mcp.tool(
      name="list_models",
      description="List all locally available Ollama models.",
  )
  ```

- The low-level async helper that calls the Ollama REST API endpoint `/api/tags` to fetch all locally available models and returns the raw JSON list.

  ```python
  async def list_models() -> list[dict[str, Any]]:
      async with _client() as c:
          r = await c.get("/api/tags")
          r.raise_for_status()
          return r.json().get("models", [])
  ```
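The per-model transformation applied by the handler can be sketched in isolation. The raw entry below is hypothetical, but its fields follow the shape returned by Ollama's `/api/tags` endpoint; the conversion divides the byte count by 1 GiB (1,073,741,824 bytes) and truncates the digest to 12 characters.

```python
# Hypothetical raw entry as returned by Ollama's /api/tags endpoint.
raw = {
    "name": "llama3:8b",
    "size": 4_661_224_676,  # size in bytes (hypothetical)
    "modified_at": "2024-05-01T12:00:00Z",
    "digest": "sha256:abcdef0123456789",
}

# Same transformation the handler applies to each model entry.
entry = {
    "name": raw.get("name"),
    "size_gb": round(raw.get("size", 0) / 1_073_741_824, 2),  # bytes -> GiB
    "modified_at": raw.get("modified_at"),
    "digest": raw.get("digest", "")[:12],  # first 12 chars only
}

print(entry["size_gb"])  # 4_661_224_676 / 2**30 rounds to 4.34
print(entry["digest"])   # "sha256:abcde"
```

Note that `.get(..., default)` keeps the transform tolerant of missing fields: an entry without `size` reports `0.0` GB rather than raising a `KeyError`.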