
Ollama MCP Server

by NightTrek

create

Create and configure a local Ollama model from a Modelfile. Specify the model name and the Modelfile path to integrate Ollama's LLM capabilities into MCP-powered applications.

Instructions

Create a model from a Modelfile

Input Schema

Name        Required   Description           Default
modelfile   Yes        Path to Modelfile     (none)
name        Yes        Name for the model    (none)

Input Schema (JSON Schema)

{ "additionalProperties": false, "properties": { "modelfile": { "description": "Path to Modelfile", "type": "string" }, "name": { "description": "Name for the model", "type": "string" } }, "required": [ "name", "modelfile" ], "type": "object" }



MCP directory API

We provide all information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp'
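
The same endpoint can also be queried from code. A minimal TypeScript sketch using fetch is shown below; the exact response shape is not documented on this page, so the result is simply logged.

// Fetch this server's directory entry from the Glama MCP API.
const response = await fetch(
  "https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp"
);
if (!response.ok) {
  throw new Error(`Request failed with status ${response.status}`);
}
const server = await response.json();
console.log(server);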

If you have feedback or need assistance with the MCP directory API, please join our Discord server.