pull
Downloads an AI model from a registry to your local machine so it can be run offline with Ollama.
Instructions
Pull a model from a registry
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | The name of the model to pull (string) | — |
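For example, a client could invoke the tool with arguments like the following (the model name here is illustrative, not taken from this project):

```
{ "name": "llama3.2" }
```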
Implementation Reference
- `src/index.ts:90-97` (handler) — the handler function that executes the `pull` tool logic: it pulls the specified model via `ollama.pull({ model: name })` and formats the result or error response.

  ```typescript
  async ({ name }) => {
    try {
      const result = await ollama.pull({ model: name });
      return {
        content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
      };
    } catch (error) {
      return {
        content: [{ type: "text", text: `Error: ${formatError(error)}` }],
        isError: true
      };
    }
  }
  ```
- `src/index.ts:88` (schema) — Zod input schema for the `pull` tool, requiring a `name` parameter of type string.

  ```typescript
  inputSchema: { name: z.string() },
  ```
- `src/index.ts:83-98` (registration) — registration of the `pull` tool via `server.registerTool`, including the schema and handler.

  ```typescript
  server.registerTool(
    "pull",
    {
      title: "Pull model",
      description: "Pull a model from a registry",
      inputSchema: { name: z.string() },
    },
    async ({ name }) => {
      try {
        const result = await ollama.pull({ model: name });
        return {
          content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
        };
      } catch (error) {
        return {
          content: [{ type: "text", text: `Error: ${formatError(error)}` }],
          isError: true
        };
      }
    }
  );
  ```
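The handler calls a `formatError` helper whose implementation is not shown in this reference. A minimal sketch of what such a helper might look like (an assumption; the actual code in `src/index.ts` may differ):

```typescript
// Hypothetical sketch of the `formatError` helper referenced by the handler.
// It produces a human-readable message for the tool's error response.
function formatError(error: unknown): string {
  // Prefer the Error's message when available; otherwise stringify the value.
  if (error instanceof Error) {
    return error.message;
  }
  return String(error);
}
```

This pattern keeps the `catch (error)` branch safe under TypeScript's `unknown` catch-variable typing, since thrown values are not guaranteed to be `Error` instances.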