Server Configuration

Describes the environment variables used to configure the server. Both variables are optional and fall back to the defaults shown.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| PORT | No | Server port | 3456 |
| OLLAMA_API | No | Ollama API endpoint | http://localhost:11434 |
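A minimal sketch of how these defaults can be resolved at startup. The variable names and default values come from the table above; the fallback logic itself is an illustrative assumption, not the server's actual code:

```python
import os

def load_config(env=os.environ):
    """Resolve server settings from environment variables, falling back
    to the documented defaults when a variable is unset."""
    return {
        "port": int(env.get("PORT", "3456")),
        "ollama_api": env.get("OLLAMA_API", "http://localhost:11434"),
    }

# With no overrides set, the documented defaults apply.
config = load_config(env={})
# config == {"port": 3456, "ollama_api": "http://localhost:11434"}
```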

Tools

Functions exposed to the LLM to take actions

| Name | Description |
| --- | --- |
| ollama_chat | Chat with a model using conversation messages. Supports system messages, multi-turn conversations, tool calling, and generation options. |
| ollama_copy | Copy a model. Creates a duplicate of an existing model under a new name. |
| ollama_create | Create a new model with structured parameters. Allows customization of model behavior, system prompts, and templates. |
| ollama_delete | Delete a model from local storage. Removes the model and frees up disk space. |
| ollama_embed | Generate embeddings for text input. Returns numerical vector representations. |
| ollama_generate | Generate a completion from a prompt. Simpler than chat; useful for single-turn completions. |
| ollama_list | List all Ollama models installed locally. Returns model names, sizes, and modification dates. |
| ollama_ps | List running models. Shows which models are currently loaded in memory. |
| ollama_pull | Pull a model from the Ollama registry. Downloads the model to make it available locally. |
| ollama_push | Push a model to the Ollama registry. Uploads a local model to make it available remotely. |
| ollama_show | Show detailed information about a specific model, including its modelfile, parameters, and architecture. |
| ollama_web_fetch | Fetch a web page by URL using Ollama's web fetch API. Returns the page title, content, and links. Requires the OLLAMA_API_KEY environment variable. |
| ollama_web_search | Perform a web search using Ollama's web search API. Augments models with up-to-date information to reduce hallucinations. Requires the OLLAMA_API_KEY environment variable. |
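These tools wrap Ollama's REST API at the configured OLLAMA_API endpoint. As a sketch, the request below shows the payload shape a tool like ollama_chat would send to Ollama's documented /api/chat endpoint; the model name is an example, and how this MCP server forwards requests internally is an assumption:

```python
import json
import urllib.request

OLLAMA_API = "http://localhost:11434"  # default from the configuration table

def build_chat_request(model, messages, api=OLLAMA_API):
    """Build an HTTP request for Ollama's /api/chat endpoint.
    The payload shape follows Ollama's REST API (model, messages, stream)."""
    payload = {"model": model, "messages": messages, "stream": False}
    return urllib.request.Request(
        f"{api}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "llama3.2",  # example model name; use any model installed locally
    [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
)
# Actually sending the request requires a running Ollama instance:
# resp = urllib.request.urlopen(req)
```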

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rawveg/ollama-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.