
show

Retrieve detailed information about a specific AI model by specifying its name. This tool is part of the Ollama MCP Server, which manages and interacts with local AI models.

Instructions

Show information for a model

Input Schema

Name    Required    Description    Default
name    Yes
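
As an illustration of the schema in use, here is a minimal sketch of calling the tool from an MCP client built with the TypeScript SDK. The launch command and the model name are placeholders, not values taken from this server's documentation:

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Placeholder launch command: adjust to however you start the Ollama MCP Server locally.
    const transport = new StdioClientTransport({
      command: "node",
      args: ["path/to/ollama-mcp-server/build/index.js"],
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // "name" is the only required argument; "llama3.2" is just an example model tag.
    const result = await client.callTool({
      name: "show",
      arguments: { name: "llama3.2" },
    });

    console.log(result);
    await client.close();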

Implementation Reference

  • Handler function that executes the 'show' tool: fetches model info via ollama.show and returns JSON or an error.

    async ({ name }) => {
      try {
        const result = await ollama.show({ model: name });
        return {
          content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
        };
      } catch (error) {
        return {
          content: [{ type: "text", text: `Error: ${formatError(error)}` }],
          isError: true,
        };
      }
    }

  • Zod input schema defining the required 'name' parameter as a string.

    inputSchema: { name: z.string() },

  • src/index.ts:47-62 (registration)

    Full registration of the 'show' tool, including its name, metadata, input schema, and inline handler.

    server.registerTool(
      "show",
      {
        title: "Show model info",
        description: "Show information for a model",
        inputSchema: { name: z.string() },
      },
      async ({ name }) => {
        try {
          const result = await ollama.show({ model: name });
          return {
            content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
          };
        } catch (error) {
          return {
            content: [{ type: "text", text: `Error: ${formatError(error)}` }],
            isError: true,
          };
        }
      }
    );
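
The handler above calls a formatError helper that is not shown in this excerpt. A minimal sketch of what such a helper could look like, assuming its only job is to turn an unknown thrown value into a readable message (the actual helper in src/index.ts may differ):

    // Hypothetical helper, sketched for illustration; not necessarily the repository's implementation.
    function formatError(error: unknown): string {
      return error instanceof Error ? error.message : String(error);
    }

Returning the message in the result with isError: true, rather than throwing, follows the MCP convention of reporting tool failures as tool results so the calling model can see and react to them.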

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
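
The same endpoint can also be queried from code. A minimal sketch using fetch in TypeScript; the response is logged as-is, since its shape is not documented here:

    const response = await fetch(
      "https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server",
    );
    if (!response.ok) {
      throw new Error(`Request failed with status ${response.status}`);
    }
    console.log(await response.json());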

If you have feedback or need assistance with the MCP directory API, please join our Discord server.