list
by hyzhak

View available local AI models to select and manage them for private, controlled AI interactions.

Instructions

List all models in Ollama

Input Schema

No arguments
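
Because the input schema is empty, invoking the tool from an MCP client is a plain callTool request with an empty arguments object. The following is a minimal, illustrative sketch using the @modelcontextprotocol/sdk client over stdio; the launch command, client name, and version are assumptions, not details taken from this server's documentation.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Assumption: the server runs as a local stdio process; adjust the
    // command/args to however you actually start hyzhak's ollama-mcp-server.
    const transport = new StdioClientTransport({
      command: "node",
      args: ["dist/index.js"],
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // Call the 'list' tool with no arguments.
    const result = await client.callTool({ name: "list", arguments: {} });

    // The handler returns the ollama.list() result as a single text content item.
    console.log(result.content);

    await client.close();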

Implementation Reference

  • Handler function that executes the 'list' tool: it fetches models using ollama.list() and returns them as pretty-printed JSON, or returns a formatted error.

    async () => {
      try {
        const result = await ollama.list();
        return {
          content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
        };
      } catch (error) {
        return {
          content: [{ type: "text", text: `Error: ${formatError(error)}` }],
          isError: true
        };
      }
    }

  • Input schema and metadata (title, description) for the 'list' tool. No input parameters are required.

    {
      title: "List models",
      description: "List all models in Ollama",
      inputSchema: {},
    },

  • src/index.ts:29-44 (registration)
    Registration of the 'list' tool using server.registerTool, including name, schema, and handler (a sketch of the surrounding setup follows this list).

    server.registerTool(
      "list",
      {
        title: "List models",
        description: "List all models in Ollama",
        inputSchema: {},
      },
      async () => {
        try {
          const result = await ollama.list();
          return {
            content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
          };
        } catch (error) {
          return {
            content: [{ type: "text", text: `Error: ${formatError(error)}` }],
            isError: true
          };
        }
      }
    );

  • Helper function used in the 'list' handler (and others) to format errors.

    const formatError = (error: unknown): string =>
      error instanceof Error ? error.message : String(error);
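
The snippets above assume an ollama client and an MCP server instance already in scope. A minimal sketch of what that surrounding setup could look like is shown below; the server name, version string, and transport choice are assumptions rather than verified details of this repository.

    import ollama from "ollama";
    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

    // Error formatter referenced by the tool handler (shown above).
    const formatError = (error: unknown): string =>
      error instanceof Error ? error.message : String(error);

    // Assumption: name and version are illustrative placeholders.
    const server = new McpServer({ name: "ollama-mcp-server", version: "0.0.0" });

    // ... 'list' tool registration from src/index.ts:29-44 goes here ...

    // Serve over stdio so MCP clients can spawn this process directly.
    const transport = new StdioServerTransport();
    await server.connect(transport);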

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
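
The same lookup from TypeScript, assuming a runtime with a global fetch (Node 18+); the shape of the JSON response is whatever the directory API returns and is not specified here.

    // Fetch this server's entry from the Glama MCP directory API.
    const res = await fetch(
      "https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server"
    );
    if (!res.ok) {
      throw new Error(`Directory API request failed: ${res.status}`);
    }
    const entry = await res.json();
    console.log(entry);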

If you have feedback or need assistance with the MCP directory API, please join our Discord server.