
create

Create a custom model under a unique name from a selected base model within the Ollama MCP Server environment.

Instructions

Create a model from a base model (remote only, no Modelfile support)

Input Schema

Name    Required    Description                                 Default
from    Yes         Base model to create the new model from    –
name    Yes         Name for the new model                     –
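
For illustration, here is a minimal sketch of how an MCP client might invoke this tool, using the @modelcontextprotocol/sdk client. The launch command, package name, and model names are placeholder assumptions, not taken from this server's documentation.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Hypothetical launch command; adjust to however the server is actually installed.
    const transport = new StdioClientTransport({
      command: "npx",
      args: ["-y", "ollama-mcp-server"],
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // Call the 'create' tool: 'name' is the new model, 'from' is the base model it derives from.
    const result = await client.callTool({
      name: "create",
      arguments: { name: "my-custom-model", from: "llama3.2" },
    });
    console.log(result.content);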

Implementation Reference

  • The handler function for the 'create' tool: it calls ollama.create with the model name and source model, and returns the JSON-formatted result or an error. (The formatError helper it uses is not shown in this reference; a sketch follows this list.)

    async ({ name, from }) => {
      try {
        const result = await ollama.create({ model: name, from });
        return {
          content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
        };
      } catch (error) {
        return {
          content: [{ type: "text", text: `Error: ${formatError(error)}` }],
          isError: true
        };
      }
    }
  • Zod input schema for the 'create' tool, defining the required string parameters 'name' and 'from'.

    inputSchema: {
      name: z.string(),
      from: z.string()
    },
  • src/index.ts:65-80 (registration)
    Registration of the 'create' MCP tool via server.registerTool, specifying the tool name, metadata, input schema, and handler function.

    server.registerTool(
      "create",
      {
        title: "Create model (remote only supports 'from')",
        description: "Create a model from a base model (remote only, no Modelfile support)",
        inputSchema: { name: z.string(), from: z.string() },
      },
      async ({ name, from }) => {
        try {
          const result = await ollama.create({ model: name, from });
          return {
            content: [{ type: "text", text: JSON.stringify(result, null, 2) }]
          };
        } catch (error) {
          return {
            content: [{ type: "text", text: `Error: ${formatError(error)}` }],
            isError: true
          };
        }
      }
    );
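
The handler and registration above call a formatError helper that is not included in this reference. As a minimal sketch only, assuming a conventional error-to-string conversion (the server's actual implementation may differ):

    // Hypothetical helper, not taken from src/index.ts: converts an unknown
    // thrown value into a readable message for the tool's error response.
    function formatError(error: unknown): string {
      return error instanceof Error ? error.message : String(error);
    }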

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
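
The same request in TypeScript, as a small sketch using the built-in fetch (the response schema is not documented here, so the result is simply printed as JSON):

    // Fetch this server's record from the Glama MCP directory API.
    const response = await fetch(
      "https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server"
    );
    const server = await response.json();
    console.log(JSON.stringify(server, null, 2));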

If you have feedback or need assistance with the MCP directory API, please join our Discord server.