
Ollama MCP Server

by NightTrek

rm

Delete local AI models from the Ollama MCP Server to manage storage and organize available models.

Instructions

Remove a model

Input Schema

| Name | Required | Description                 | Default |
|------|----------|-----------------------------|---------|
| name | Yes      | Name of the model to remove |         |
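As a sketch, a client invokes this tool with a `tools/call` request that carries the model name as the only argument. The request shape follows the MCP tool-call convention; the model name `llama3.2` below is an illustrative placeholder, not from the source:

```typescript
// Illustrative 'tools/call' request body for the 'rm' tool.
// The model name "llama3.2" is an example placeholder.
const rmRequest = {
  method: 'tools/call',
  params: {
    name: 'rm',
    arguments: { name: 'llama3.2' },
  },
};

console.log(JSON.stringify(rmRequest.params));
```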

Implementation Reference

  • The main handler function for the 'rm' tool. It runs `ollama rm {name}` via execAsync and returns stdout (or stderr, if stdout is empty) as text content; on failure it wraps the error in an McpError with ErrorCode.InternalError.
    private async handleRemove(args: any) {
      try {
        const { stdout, stderr } = await execAsync(`ollama rm ${args.name}`);
        return {
          content: [
            {
              type: 'text',
              text: stdout || stderr,
            },
          ],
        };
      } catch (error) {
        throw new McpError(ErrorCode.InternalError, `Failed to remove model: ${formatError(error)}`);
      }
    }
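Because `args.name` is spliced directly into a shell string, a crafted model name could inject extra shell commands. A minimal hardening sketch (the helper names and the name pattern below are assumptions, not the server's code) validates the name and passes it to `ollama` as a discrete argument via Node's `execFile`, so shell metacharacters are never interpreted:

```typescript
// Hardening sketch, not the server's implementation. The name pattern
// is an assumption: Ollama model names use word characters, '.', '-',
// '/', plus an optional ':tag' suffix.
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileAsync = promisify(execFile);

function isValidModelName(name: string): boolean {
  return /^[\w./-]+(:[\w.-]+)?$/.test(name);
}

async function removeModel(name: string): Promise<string> {
  if (!isValidModelName(name)) {
    throw new Error(`Invalid model name: ${name}`);
  }
  // execFile passes 'name' as a separate argv entry, so no shell
  // parsing is applied to it.
  const { stdout, stderr } = await execFileAsync('ollama', ['rm', name]);
  return stdout || stderr;
}
```

The same validation could equally be enforced with a `pattern` keyword in the tool's input schema.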
  • src/index.ts:192-206 (registration)
    Registration of the 'rm' tool in the listTools response, including name, description, and input schema.
    {
      name: 'rm',
      description: 'Remove a model',
      inputSchema: {
        type: 'object',
        properties: {
          name: {
            type: 'string',
            description: 'Name of the model to remove',
          },
        },
        required: ['name'],
        additionalProperties: false,
      },
    },
  • Input schema definition for the 'rm' tool, specifying an object with a required 'name' string property.
    inputSchema: {
      type: 'object',
      properties: {
        name: {
          type: 'string',
          description: 'Name of the model to remove',
        },
      },
      required: ['name'],
      additionalProperties: false,
    },
  • Dispatch case in the main CallToolRequestSchema handler that routes 'rm' tool calls to the handleRemove function.
    case 'rm':
      return await this.handleRemove(request.params.arguments);
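For context, this case sits inside a switch over `request.params.name` in the CallToolRequestSchema handler. A minimal sketch of that routing (the function shape and the unknown-tool error are assumptions; only the 'rm' case is taken from the source):

```typescript
// Sketch of tool-name routing; only the 'rm' case comes from the
// source. The fallback error for unmatched tools is an assumption.
type TextContent = { type: 'text'; text: string };
type ToolResult = { content: TextContent[] };

async function routeToolCall(
  toolName: string,
  args: { name?: string },
  handleRemove: (a: { name?: string }) => Promise<ToolResult>,
): Promise<ToolResult> {
  switch (toolName) {
    case 'rm':
      return await handleRemove(args);
    default:
      throw new Error(`Unknown tool: ${toolName}`);
  }
}
```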


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.