# create
Create custom AI models from existing base models using the Ollama MCP Server to run and manage local AI with privacy.
## Instructions
Create a model from a base model (remote only, no Modelfile support)
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | Name for the new model | |
| from | Yes | Existing base model to create the new model from | |
## Implementation Reference
- `src/index.ts:72-79` (handler) — The handler function for the `create` tool. It takes `name` and `from` parameters, calls `ollama.create` to create a new model from the base model specified in `from`, and returns the JSON result or an error message.

  ```typescript
  async ({ name, from }) => {
    try {
      const result = await ollama.create({ model: name, from });
      return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
    } catch (error) {
      return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
    }
  }
  ```
- `src/index.ts:70` (schema) — Input schema for the `create` tool, defining the required string parameters `name` (new model name) and `from` (base model).

  ```typescript
  inputSchema: { name: z.string(), from: z.string() },
  ```
- `src/index.ts:65-80` (registration) — Registration of the `create` tool using `server.registerTool`, including title, description, input schema, and handler function.

  ```typescript
  server.registerTool(
    "create",
    {
      title: "Create model (remote only supports 'from')",
      description: "Create a model from a base model (remote only, no Modelfile support)",
      inputSchema: { name: z.string(), from: z.string() },
    },
    async ({ name, from }) => {
      try {
        const result = await ollama.create({ model: name, from });
        return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
      } catch (error) {
        return { content: [{ type: "text", text: `Error: ${formatError(error)}` }], isError: true };
      }
    }
  );
  ```