# create
Create a custom model locally from a Modelfile, which defines the base model, parameters, and prompts, enabling model configurations tailored to specific applications.
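To illustrate what such a Modelfile contains, here is a minimal sketch embedded as a string; the base model, parameter value, and system prompt are examples, not defaults of this server.

```typescript
// Illustrative Modelfile contents (assumed example, not part of this repository):
// a base model, one sampling parameter, and a system prompt.
const exampleModelfile = `FROM llama3
PARAMETER temperature 0.7
SYSTEM You are a concise technical assistant.
`;

console.log(exampleModelfile);
```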
## Instructions
Create a model from a Modelfile
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | Name for the model | |
| modelfile | Yes | Path to Modelfile | |
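As a usage sketch, the snippet below writes an illustrative Modelfile to disk and builds the matching argument object; the `my-assistant` name and `./Modelfile` path are hypothetical stand-ins, not values required by the tool.

```typescript
import { writeFileSync } from 'node:fs';

// Write an illustrative Modelfile to a local path (contents as in the example above).
writeFileSync(
  './Modelfile',
  'FROM llama3\nPARAMETER temperature 0.7\nSYSTEM You are a concise technical assistant.\n'
);

// Arguments satisfying the input schema: both fields are required.
const createArgs = {
  name: 'my-assistant',     // Name for the model
  modelfile: './Modelfile', // Path to Modelfile
};

console.log(createArgs);
```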
## Implementation Reference
- src/index.ts:308-322 (handler). The function that executes the `create` tool by running the `ollama create` command with the provided model name and Modelfile path (a hardened alternative that avoids shell interpolation is sketched after this list).

  ```typescript
  private async handleCreate(args: any) {
    try {
      const { stdout, stderr } = await execAsync(`ollama create ${args.name} -f ${args.modelfile}`);
      return {
        content: [
          {
            type: 'text',
            text: stdout || stderr,
          },
        ],
      };
    } catch (error) {
      throw new McpError(ErrorCode.InternalError, `Failed to create model: ${formatError(error)}`);
    }
  }
  ```
- src/index.ts:79-93 (schema). Input schema defining the required `name` and `modelfile` parameters for the `create` tool.

  ```typescript
  inputSchema: {
    type: 'object',
    properties: {
      name: {
        type: 'string',
        description: 'Name for the model',
      },
      modelfile: {
        type: 'string',
        description: 'Path to Modelfile',
      },
    },
    required: ['name', 'modelfile'],
    additionalProperties: false,
  },
  ```
- src/index.ts:76-94 (registration). Registration of the `create` tool in the ListTools response, including its name, description, and input schema.

  ```typescript
  {
    name: 'create',
    description: 'Create a model from a Modelfile',
    inputSchema: {
      type: 'object',
      properties: {
        name: {
          type: 'string',
          description: 'Name for the model',
        },
        modelfile: {
          type: 'string',
          description: 'Path to Modelfile',
        },
      },
      required: ['name', 'modelfile'],
      additionalProperties: false,
    },
  },
  ```
- src/index.ts:258-259 (registration). Dispatch case in the CallTool request handler that routes `create` tool calls to the handleCreate method.

  ```typescript
  case 'create':
    return await this.handleCreate(request.params.arguments);
  ```
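Because `handleCreate` interpolates `args.name` and `args.modelfile` directly into a shell command string, a defensive variant could pass them as discrete arguments via Node's `execFile`. The sketch below is an assumed alternative, not code from this repository.

```typescript
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileAsync = promisify(execFile);

// Sketch of an alternative handler body: the model name and Modelfile path are
// passed as separate argv entries, so the shell never interprets them.
async function createModel(name: string, modelfile: string): Promise<string> {
  const { stdout, stderr } = await execFileAsync('ollama', ['create', name, '-f', modelfile]);
  return stdout || stderr;
}
```

Since `execFile` bypasses shell interpretation, spaces or other shell metacharacters in a model name or path are treated as literal argument text rather than commands.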