# Ollama MCP Server

by NightTrek
## Server Configuration

Environment variables used to configure the server. None are required; the default is used when a variable is unset.
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| `OLLAMA_HOST` | No | Custom Ollama API endpoint | `http://127.0.0.1:11434` |
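A typical way to supply this variable is through the MCP client's server configuration. The sketch below assumes a standard `mcpServers` entry (as used by Claude Desktop and similar clients) and a locally built server; the `/path/to/ollama-mcp/build/index.js` path is a placeholder for your own installation.

```json
{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp/build/index.js"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"
      }
    }
  }
}
```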
## Schema

### Prompts

Interactive templates invoked by user choice.

No prompts are defined by this server.
### Resources

Contextual data attached to and managed by the client.

No resources are defined by this server.
### Tools

Functions the LLM can call to take actions.
| Name | Description |
| --- | --- |
| `serve` | Start the Ollama server |
| `create` | Create a model from a Modelfile |
| `show` | Show information for a model |
| `run` | Run a model |
| `pull` | Pull a model from a registry |
| `push` | Push a model to a registry |
| `list` | List models |
| `cp` | Copy a model |
| `rm` | Remove a model |
| `chat_completion` | OpenAI-compatible chat completion API (see the example after this table) |
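As a quick illustration, the sketch below connects to the server over stdio and invokes `chat_completion` from a TypeScript MCP client. It assumes the official `@modelcontextprotocol/sdk` package, a built server entry point at `build/index.js`, and a locally pulled model named `llama3.2`; the exact argument schema is defined by the server, so treat the field names here as illustrative rather than authoritative.

```typescript
// Sketch: calling this server's chat_completion tool via the MCP TypeScript SDK.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server as a child process; path and env are assumptions
// matching the configuration example above.
const transport = new StdioClientTransport({
  command: "node",
  args: ["build/index.js"],
  env: { OLLAMA_HOST: "http://127.0.0.1:11434" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// List the tools the server exposes (serve, run, pull, chat_completion, ...).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Invoke the OpenAI-compatible chat completion tool. The model name is a
// placeholder; use any model you have pulled locally.
const result = await client.callTool({
  name: "chat_completion",
  arguments: {
    model: "llama3.2",
    messages: [{ role: "user", content: "Why is the sky blue?" }],
  },
});
console.log(result.content);

await client.close();
```

The remaining tools are invoked the same way through `callTool`, and their names mirror the `ollama` CLI subcommands of the same spelling (`pull`, `run`, `list`, and so on).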