Ollama MCP Server

by hyzhak

cp

Copy Ollama AI models between locations to duplicate, back up, or transfer them within the local environment.
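
Under the hood this corresponds to the copy call in the ollama JavaScript client, as the implementation reference below shows. A minimal standalone sketch, with placeholder model names and assuming a local Ollama instance on the default port:

    import ollama from "ollama";

    // Duplicate an existing local model under a new name (names are placeholders).
    await ollama.copy({ source: "llama3.2", destination: "llama3.2-backup" });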

Instructions

Copy a model

Input Schema

Name          Required   Description                           Default
source        Yes        Name of the existing model to copy
destination   Yes        Name to give the copied model
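
For illustration, here is a sketch of invoking this tool from an MCP client using the official @modelcontextprotocol/sdk over stdio. The launch command and the model names are placeholders, not taken from this server's documentation:

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Launch the server over stdio (command and args are placeholders).
    const transport = new StdioClientTransport({
      command: "node",
      args: ["dist/index.js"],
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // Call the 'cp' tool with its two required string arguments.
    const result = await client.callTool({
      name: "cp",
      arguments: { source: "llama3.2", destination: "llama3.2-backup" },
    });

    console.log(result);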

Implementation Reference

  • Handler function that copies an Ollama model from source to destination using ollama.copy() and returns the result or an error (see the note on formatError after this list).

    async ({ source, destination }) => {
      try {
        const result = await ollama.copy({ source, destination });
        return {
          content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
        };
      } catch (error) {
        return {
          content: [{ type: "text", text: `Error: ${formatError(error)}` }],
          isError: true,
        };
      }
    }

  • Input schema for the 'cp' tool, defining the required 'source' and 'destination' parameters as strings.

    {
      title: "Copy model",
      description: "Copy a model",
      inputSchema: { source: z.string(), destination: z.string() },
    }

  • src/index.ts:119-134 (registration)
    Registers the 'cp' tool with the MCP server, including schema and inline handler.

    server.registerTool(
      "cp",
      {
        title: "Copy model",
        description: "Copy a model",
        inputSchema: { source: z.string(), destination: z.string() },
      },
      async ({ source, destination }) => {
        try {
          const result = await ollama.copy({ source, destination });
          return {
            content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
          };
        } catch (error) {
          return {
            content: [{ type: "text", text: `Error: ${formatError(error)}` }],
            isError: true,
          };
        }
      }
    );
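
The handler above references a formatError helper that is not shown on this page; its actual implementation lives elsewhere in the repository. The following is only a plausible sketch of what such a helper might do:

    // Hypothetical helper, offered as an assumption; the real formatError in
    // src/index.ts may differ.
    function formatError(error: unknown): string {
      return error instanceof Error ? error.message : String(error);
    }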

MCP directory API

We provide all the information about MCP servers through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'
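
The same lookup can be done from code. Below is a minimal sketch using the standard fetch API in TypeScript; the response shape is not documented here, so it is simply printed:

    // Fetch this server's metadata from the Glama MCP directory API.
    const res = await fetch(
      "https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server",
    );
    if (!res.ok) throw new Error(`Request failed: HTTP ${res.status}`);
    console.log(await res.json());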

If you have feedback or need assistance with the MCP directory API, please join our Discord server.