by hyzhak

create

Create custom AI models from existing base models using the Ollama MCP Server to run and manage local AI with privacy.

Instructions

Create a model from a base model (remote only, no Modelfile support)

Input Schema

Name | Required | Description                   | Default
-----|----------|-------------------------------|--------
name | Yes      | Name of the new model         |
from | Yes      | Base model to create from     |
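Per the MCP `tools/call` convention, a client invokes this tool with a JSON-RPC request along these lines (the model names below are illustrative, not taken from the source):

```typescript
// Illustrative MCP tools/call request for the "create" tool.
// The model names ("my-model", "llama3.2") are examples only.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "create",
    arguments: {
      name: "my-model", // name of the new model to create
      from: "llama3.2", // existing base model to copy from
    },
  },
};

console.log(JSON.stringify(request, null, 2));
```

Both arguments are required strings, matching the input schema above.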

Implementation Reference

  • The handler function for the "create" tool. It takes 'name' and 'from' parameters, calls ollama.create to create a new model from the base model specified in 'from', and returns the JSON result or an error message.

        async ({ name, from }) => {
          try {
            const result = await ollama.create({ model: name, from });
            return {
              content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
            };
          } catch (error) {
            return {
              content: [{ type: "text", text: `Error: ${formatError(error)}` }],
              isError: true,
            };
          }
        }

  • Input schema for the "create" tool, defining the required string parameters 'name' (new model name) and 'from' (base model).

        inputSchema: { name: z.string(), from: z.string() },

  • src/index.ts:65-80 (registration)

    Registration of the "create" tool using server.registerTool, including title, description, input schema, and handler function.

        server.registerTool(
          "create",
          {
            title: "Create model (remote only supports 'from')",
            description: "Create a model from a base model (remote only, no Modelfile support)",
            inputSchema: { name: z.string(), from: z.string() },
          },
          async ({ name, from }) => {
            try {
              const result = await ollama.create({ model: name, from });
              return {
                content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
              };
            } catch (error) {
              return {
                content: [{ type: "text", text: `Error: ${formatError(error)}` }],
                isError: true,
              };
            }
          }
        );
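The formatError helper called in the handler is not shown in this excerpt. A minimal sketch of what such a helper might look like (an assumption, not the project's actual code):

```typescript
// Hypothetical sketch of a formatError helper; the real implementation
// in src/index.ts is not shown in this excerpt.
function formatError(error: unknown): string {
  // Errors caught in TypeScript are typed `unknown`, so narrow first.
  return error instanceof Error ? error.message : String(error);
}

console.log(formatError(new Error("model not found"))); // → "model not found"
console.log(formatError("connection refused")); // → "connection refused"
```

Narrowing on `instanceof Error` keeps the handler's catch block safe for non-Error throws as well.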

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.