Glama

Ollama MCP Server

by hyzhak

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default
OLLAMA_HOST | No | Custom Ollama API endpoint | http://127.0.0.1:11434
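As a minimal sketch of how this variable is used: set it in the environment of the process that launches the server to point at a non-default Ollama instance. The host and port below are placeholders, not values from this listing.

```shell
# Override the default endpoint (http://127.0.0.1:11434) before
# starting the MCP server. The address here is an example value.
export OLLAMA_HOST=http://192.168.1.50:11434
```

If the variable is unset, the server falls back to the default shown in the table above.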

Schema

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

Tools

Functions exposed to the LLM to take actions

Name | Description
list | List all models in Ollama
show | Show information for a model
create | Create a model from a base model (remote only, no Modelfile support)
pull | Pull a model from a registry
push | Push a model to a registry
cp | Copy a model
rm | Remove a model
run | Run a model with a prompt. Optionally accepts an image file path for vision/multimodal models and a temperature parameter.
chat_completion | OpenAI-compatible chat completion API. Supports optional images per message for vision/multimodal models.
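As a sketch of how an MCP client invokes one of these tools, a JSON-RPC `tools/call` request for `run` might look like the fragment below. The argument keys (`name`, `prompt`, `temperature`) are assumptions inferred from the tool description above, not confirmed by this listing.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run",
    "arguments": {
      "name": "llama3.2",
      "prompt": "Summarize the MCP protocol in one sentence.",
      "temperature": 0.7
    }
  }
}
```

The server forwards the request to the Ollama instance at OLLAMA_HOST and returns the model's output in the tool result.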

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/hyzhak/ollama-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.