ollama_create

Create custom AI models by specifying base models, system prompts, and templates to tailor model behavior for specific use cases.

Instructions

Create a new model with structured parameters. Allows customization of model behavior, system prompts, and templates.

Input Schema

| Name     | Required | Description                                      | Default |
|----------|----------|--------------------------------------------------|---------|
| model    | Yes      | Name for the new model                           |         |
| from     | Yes      | Base model to derive from (e.g., llama2, llama3) |         |
| system   | No       | System prompt for the model                      |         |
| template | No       | Prompt template to use                           |         |
| license  | No       | License for the model                            |         |
| format   | No       | Response format (json or markdown)               | json    |
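For illustration, here is an arguments object a client might send to this tool. The model names and system prompt are hypothetical; only `model` and `from` are required.

```typescript
// Illustrative arguments for an ollama_create call.
// `model` and `from` are required; the remaining fields are optional.
const createArgs = {
  model: "my-pirate-assistant",  // name for the new model (hypothetical)
  from: "llama3",                // base model to derive from
  system: "You are a helpful assistant that answers like a pirate.",
  format: "json",                // response format; defaults to "json" if omitted
};
```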

Implementation Reference

  • Core handler function that executes the `ollama.create` API call to create a new model.

    ```typescript
    export async function createModel(
      ollama: Ollama,
      options: CreateModelOptions,
      format: ResponseFormat
    ): Promise<string> {
      const response = await ollama.create({
        model: options.model,
        from: options.from,
        system: options.system,
        template: options.template,
        license: options.license,
        stream: false,
      });
      return formatResponse(JSON.stringify(response), format);
    }
    ```
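The `formatResponse` helper is referenced above but not shown on this page. A minimal sketch of what such a helper might do, assuming it passes serialized JSON through unchanged for `json` and wraps it in a fenced block for `markdown` (the real implementation may differ):

```typescript
type ResponseFormat = "json" | "markdown";

// Hypothetical sketch: the actual formatResponse is not shown in this reference.
function formatResponse(payload: string, format: ResponseFormat): string {
  if (format === "markdown") {
    // Wrap the raw JSON in a fenced code block for markdown consumers.
    const fence = "`".repeat(3);
    return `${fence}json\n${payload}\n${fence}`;
  }
  // "json": return the serialized response unchanged.
  return payload;
}
```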
  • Zod schema used to validate and parse the input arguments for the ollama_create tool.

    ```typescript
    export const CreateModelInputSchema = z.object({
      model: z.string().min(1),
      from: z.string().min(1),
      system: z.string().optional(),
      template: z.string().optional(),
      license: z.string().optional(),
      format: ResponseFormatSchema.default('json'),
    });
    ```
  • Tool definition object that registers the 'ollama_create' tool with the MCP server via autoloader discovery.

    ```typescript
    export const toolDefinition: ToolDefinition = {
      name: 'ollama_create',
      description:
        'Create a new model with structured parameters. Allows customization of model behavior, system prompts, and templates.',
      inputSchema: {
        type: 'object',
        properties: {
          model: {
            type: 'string',
            description: 'Name for the new model',
          },
          from: {
            type: 'string',
            description: 'Base model to derive from (e.g., llama2, llama3)',
          },
          system: {
            type: 'string',
            description: 'System prompt for the model',
          },
          template: {
            type: 'string',
            description: 'Prompt template to use',
          },
          license: {
            type: 'string',
            description: 'License for the model',
          },
          format: {
            type: 'string',
            enum: ['json', 'markdown'],
            default: 'json',
          },
        },
        required: ['model', 'from'],
      },
      handler: async (
        ollama: Ollama,
        args: Record<string, unknown>,
        format: ResponseFormat
      ) => {
        const validated = CreateModelInputSchema.parse(args);
        return createModel(
          ollama,
          {
            model: validated.model,
            from: validated.from,
            system: validated.system,
            template: validated.template,
            license: validated.license,
          },
          format
        );
      },
    };
    ```
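To make the validation step concrete without the Zod dependency, here is a dependency-free sketch of the checks `CreateModelInputSchema` performs: non-empty `model` and `from`, optional string fields, and a `format` that defaults to `"json"`. This is for explanation only; the actual tool uses Zod.

```typescript
// Dependency-free illustration of the validation the Zod schema performs.
interface CreateArgs {
  model: string;
  from: string;
  system?: string;
  template?: string;
  license?: string;
  format: "json" | "markdown";
}

function validateCreateArgs(args: Record<string, unknown>): CreateArgs {
  const { model, from } = args;
  if (typeof model !== "string" || model.length < 1) {
    throw new Error("model: non-empty string required");
  }
  if (typeof from !== "string" || from.length < 1) {
    throw new Error("from: non-empty string required");
  }
  return {
    model,
    from,
    system: typeof args.system === "string" ? args.system : undefined,
    template: typeof args.template === "string" ? args.template : undefined,
    license: typeof args.license === "string" ? args.license : undefined,
    // Mirrors ResponseFormatSchema.default('json').
    format: args.format === "markdown" ? "markdown" : "json",
  };
}
```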

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/rawveg/ollama-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.