# Ollama MCP Server

## Server Configuration

Describes the environment variables used to configure the server. None are required.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `OLLAMA_HOST` | No | Custom Ollama API endpoint | `http://127.0.0.1:11434` |
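As an illustration, an MCP client that uses the common `mcpServers` configuration format (e.g. Claude Desktop) can point this server at a non-default Ollama endpoint through an `env` block. The `command` value and the host address below are placeholders, not this server's actual launch command:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "<command-to-launch-this-server>",
      "env": {
        "OLLAMA_HOST": "http://192.168.1.50:11434"
      }
    }
  }
}
```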

## Schema

### Prompts

Interactive templates invoked by user choice.

No prompts.

### Resources

Contextual data attached and managed by the client.

No resources.

### Tools

Functions exposed to the LLM to take actions.

| Name | Description |
| --- | --- |
| `serve` | Start the Ollama server |
| `create` | Create a model from a Modelfile |
| `show` | Show information for a model |
| `run` | Run a model |
| `pull` | Pull a model from a registry |
| `push` | Push a model to a registry |
| `list` | List models |
| `cp` | Copy a model |
| `rm` | Remove a model |
| `chat_completion` | OpenAI-compatible chat completion API |
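Because the `chat_completion` tool is described as OpenAI-compatible, its arguments presumably follow the OpenAI Chat Completions request shape (`model` plus a `messages` list of role/content pairs). The sketch below builds such a payload; the exact argument names accepted by this server and the model name `llama3.2` are assumptions, not confirmed by this listing:

```python
import json

# Hypothetical argument payload for the `chat_completion` tool, following
# the OpenAI Chat Completions schema: a model name and a list of messages,
# each with a `role` ("system", "user", "assistant") and `content` string.
arguments = {
    "model": "llama3.2",  # assumed: any model returned by the `list` tool
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
}

# The payload is plain JSON, so it can be serialized directly for a tool call.
print(json.dumps(arguments, indent=2))
```

An MCP client would pass a payload like this as the `arguments` of a `tools/call` request naming the `chat_completion` tool.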