pull

by hyzhak

Download AI models from registries to your local machine for offline use with Ollama's local LLM capabilities.

Instructions

Pull a model from a registry

Input Schema

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| name | Yes      |             |         |
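
As a sketch of how a client might invoke this tool, here is a minimal example using the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`). The server path and model name below are placeholders, not values taken from this server's documentation:

```typescript
// Sketch: calling the 'pull' tool from an MCP client over stdio.
// Assumes the official TypeScript SDK; the server entry point and
// model name are placeholder assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"], // placeholder path to the built server
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// 'name' is the tool's only (required) input per the schema above.
const result = await client.callTool({
  name: "pull",
  arguments: { name: "llama3.2" }, // placeholder model name
});
console.log(result.content);
```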

Implementation Reference

  • The handler function that executes the 'pull' tool logic: pulls the specified model using `ollama.pull({ model: name })` and formats the result or error response.

    ```typescript
    async ({ name }) => {
      try {
        const result = await ollama.pull({ model: name });
        return {
          content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
        };
      } catch (error) {
        return {
          content: [{ type: "text", text: `Error: ${formatError(error)}` }],
          isError: true,
        };
      }
    }
    ```
  • Zod input schema for the 'pull' tool, requiring a 'name' parameter of type string.

    ```typescript
    inputSchema: { name: z.string() },
    ```
  • src/index.ts:83-98 (registration)

    Registration of the 'pull' tool using `server.registerTool`, including schema and handler.

    ```typescript
    server.registerTool(
      "pull",
      {
        title: "Pull model",
        description: "Pull a model from a registry",
        inputSchema: { name: z.string() },
      },
      async ({ name }) => {
        try {
          const result = await ollama.pull({ model: name });
          return {
            content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
          };
        } catch (error) {
          return {
            content: [{ type: "text", text: `Error: ${formatError(error)}` }],
            isError: true,
          };
        }
      }
    );
    ```
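
The handler references a `formatError` helper that is not shown in this excerpt. A minimal sketch of what such a helper could look like (an illustration only, not the server's actual implementation):

```typescript
// Hypothetical sketch of the formatError helper referenced above;
// the server's real implementation is not shown in this excerpt.
function formatError(error: unknown): string {
  if (error instanceof Error) {
    return error.message;
  }
  return String(error);
}
```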

MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
```
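
The same lookup can be done programmatically. A minimal fetch sketch against the endpoint documented above (the response shape is not specified in this excerpt, so it is left untyped):

```typescript
// Sketch: fetching this server's directory entry from the Glama MCP API.
// Uses the endpoint shown above; the response shape is an assumption,
// so the parsed JSON is typed as unknown.
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server"
);
if (!res.ok) throw new Error(`HTTP ${res.status}`);
const info: unknown = await res.json();
console.log(info);
```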

If you have feedback or need assistance with the MCP directory API, please join our Discord server.