
Ollama MCP Server

by hyzhak

push

Upload a locally available Ollama model to a registry for sharing or deployment. This tool exposes Ollama's model-push capability through the Ollama MCP Server.

Instructions

Push a model to a registry

Input Schema

Name    Required    Description    Default
name    Yes
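Given the schema above, an MCP tools/call request invoking this tool might look like the following (the model name is illustrative):

```json
{
  "method": "tools/call",
  "params": {
    "name": "push",
    "arguments": {
      "name": "myuser/mymodel:latest"
    }
  }
}
```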

Implementation Reference

  • Handler function that calls ollama.push({ model: name }) and formats the result or error response.
    async ({ name }) => {
      try {
        const result = await ollama.push({ model: name });
        return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
      } catch (error) {
        return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
      }
    }
  • Input schema defining the 'name' parameter as a string, along with title and description.
    {
      title: "Push model",
      description: "Push a model to a registry",
      inputSchema: { name: z.string() },
    },
  • src/index.ts:101-116 (registration)
    Registers the 'push' tool with the MCP server using server.registerTool.
    server.registerTool(
      "push",
      {
        title: "Push model",
        description: "Push a model to a registry",
        inputSchema: { name: z.string() },
      },
      async ({ name }) => {
        try {
          const result = await ollama.push({ model: name });
          return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
        } catch (error) {
          return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
        }
      }
    );
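The registered handler can be exercised without a live Ollama daemon by substituting a stub for the client. A rough sketch under that assumption (the stub's response shape and the formatError helper are illustrative, not the actual ollama-js behavior):

```typescript
// Sketch: drive the push handler's async function directly with a stubbed
// ollama client, instead of going through server.registerTool.
type ToolResult = {
  content: { type: "text"; text: string }[];
  isError?: boolean;
};

// Stub standing in for the ollama-js client; the response shape is assumed.
const ollama = {
  push: async ({ model }: { model: string }) => ({ status: "success", model }),
};

// Hypothetical error formatter; the real helper is not shown in the excerpt.
function formatError(error: unknown): string {
  return error instanceof Error ? error.message : String(error);
}

// Same handler shape as the registration above.
const handler = async ({ name }: { name: string }): Promise<ToolResult> => {
  try {
    const result = await ollama.push({ model: name });
    return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
  } catch (error) {
    return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
  }
};

handler({ name: "llama3.2:latest" }).then((r) => console.log(r.content[0].text));
```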
MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.