
Server Configuration

Describes the environment variables used to configure the server.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| ollama_url | No | Override the default Ollama endpoint. Set this if you run Ollama on a different host or port. | http://localhost:11434 |
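
If Ollama lives on a non-default host or port, it helps to confirm the endpoint answers before pointing the server at it. A minimal sketch, assuming the server reads ollama_url from its environment; the remote host shown is a placeholder:

```
# Hypothetical remote host; substitute your own.
export ollama_url=http://192.168.1.50:11434

# Confirm the endpoint responds before starting the MCP server.
curl "$ollama_url/api/version"
```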

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | {} |

Tools

Functions exposed to the LLM to take actions

ollama_status

Health check: reports whether the Ollama server is reachable and, if so, which version it is running. Use this as a precondition before other tools if you're unsure whether Ollama is running.
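
The tool presumably wraps Ollama's version route; the same check can be run directly against the REST API:

```
# A reachable server returns its version, e.g. {"version":"0.6.2"}.
curl http://localhost:11434/api/version
```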

list_models

List locally-installed models: name, size in bytes, digest, modified timestamp, family (e.g. llama), parameter size (e.g. 8.0B), and quantization level (e.g. Q4_K_M).
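
Ollama publishes this inventory at its /api/tags route; a direct call shows the same fields, assuming the tool mirrors that endpoint:

```
# Each entry carries name, size, digest, modified_at, and a details object
# with family, parameter_size, and quantization_level.
curl http://localhost:11434/api/tags
```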

list_running

List models currently loaded into VRAM with their size, VRAM footprint, and expiry timestamp. Empty list means Ollama is idle.
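
The likely backing endpoint is Ollama's /api/ps route:

```
# Loaded models are listed with size_vram and expires_at; "models": [] means idle.
curl http://localhost:11434/api/ps
```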

show_model

Show detailed information for a specific model: modelfile excerpt, parameters, template, capabilities, architecture details, quantization level.
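
The equivalent direct call, assuming the tool wraps Ollama's /api/show route; the model tag is an example:

```
# POST the model name to get its modelfile, parameters, template, and details.
curl http://localhost:11434/api/show -d '{"model": "llama3.1:8b"}'
```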

generate

Run a one-shot text completion against a local model (non-streaming). Returns the full response text plus timing and tokens/second.
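
A non-streaming completion against Ollama's /api/generate route looks like the sketch below (model and prompt are examples). The response's eval_count and eval_duration fields yield the tokens-per-second figure, since eval_duration is reported in nanoseconds:

```
# Returns one JSON object: the full response text plus timing fields.
# tokens/second = eval_count / eval_duration * 1e9
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1:8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```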

chat

Run a chat completion against a local model with message history (non-streaming). Returns the assistant's reply plus timing.
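
The corresponding Ollama route is /api/chat; message history is passed as an ordered array (model and messages below are examples):

```
# Roles are system, user, and assistant; the reply arrives as a single
# assistant message because stream is false.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1:8b",
  "messages": [
    {"role": "system", "content": "You are a terse assistant."},
    {"role": "user", "content": "Name one prime number."}
  ],
  "stream": false
}'
```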

pull_model

Download a model from the Ollama registry. Blocks until complete and can take a long time for multi-GB models. For very large pulls, prefer ollama pull in a terminal where you can watch progress.
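
Both options side by side, assuming the tool wraps Ollama's /api/pull route; the model tag is an example:

```
# Non-streaming pull: blocks until the download finishes, like the tool.
curl http://localhost:11434/api/pull -d '{"model": "llama3.1:8b", "stream": false}'

# CLI alternative for large models: shows a live progress bar.
ollama pull llama3.1:8b
```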

delete_model

Delete a locally installed model to free its disk space. Does not affect the copy in the remote registry.
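
The direct equivalent, assuming the tool wraps Ollama's /api/delete route:

```
# Removes only the local copy; the model can be pulled again later.
curl -X DELETE http://localhost:11434/api/delete -d '{"model": "llama3.1:8b"}'
```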

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```
curl -X GET 'https://glama.ai/api/mcp/v1/servers/LukeLamb/claude-ollama-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.