by hyzhak

rm

Delete a local AI model from your Ollama environment to manage storage and organize available models.

Instructions

Remove a model

Input Schema

Name    Required    Description    Default
name    Yes
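
As an illustration, the arguments object sent to this tool carries only the model name; the tag below is a placeholder, not a documented value.

    // Minimal arguments for the 'rm' tool (model tag is an assumption).
    const args = { name: "llama3.2" };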

Implementation Reference

  • The handler function for the 'rm' tool that executes ollama.delete({ model: name }) to remove the model and handles errors using formatError.

    async ({ name }) => {
      try {
        const result = await ollama.delete({ model: name });
        return {
          content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
        };
      } catch (error) {
        return {
          content: [{ type: "text", text: `Error: ${formatError(error)}` }],
          isError: true
        };
      }
    }

  • Input schema defining the required 'name' parameter as a string using Zod.

    inputSchema: { name: z.string() },

  • src/index.ts:137-152 (registration)
    Full registration of the 'rm' tool, including schema and handler function.

    server.registerTool(
      "rm",
      {
        title: "Remove model",
        description: "Remove a model",
        inputSchema: { name: z.string() },
      },
      async ({ name }) => {
        try {
          const result = await ollama.delete({ model: name });
          return {
            content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
          };
        } catch (error) {
          return {
            content: [{ type: "text", text: `Error: ${formatError(error)}` }],
            isError: true
          };
        }
      }
    );
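
For context, the sketch below shows how an MCP client could invoke this tool once connected to the server. It is a minimal example, not part of the server's code: the stdio launch command, the client name, and the model tag "llama3.2" are assumptions, and it relies on the @modelcontextprotocol/sdk client API (Client, StdioClientTransport, callTool).

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Start the Ollama MCP server over stdio (launch command is an assumption;
    // adjust it to however the server is started in your environment).
    const transport = new StdioClientTransport({
      command: "npx",
      args: ["ollama-mcp-server"]
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // Invoke the 'rm' tool; arguments.name is the model to delete
    // (the tag "llama3.2" is illustrative).
    const result = await client.callTool({
      name: "rm",
      arguments: { name: "llama3.2" }
    });
    console.log(result.content);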

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
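
The same lookup can be made from code. A minimal sketch, assuming the endpoint returns JSON and that no authentication is needed for public server metadata:

    // Fetch this server's entry from the Glama MCP directory API.
    // The response shape is not documented here, so it is logged as-is.
    const res = await fetch(
      "https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server"
    );
    if (!res.ok) throw new Error(`Request failed: ${res.status}`);
    const server = await res.json();
    console.log(server);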

If you have feedback or need assistance with the MCP directory API, please join our Discord server.