
Ollama MCP Server

by NightTrek

Server Configuration

Describes the environment variables required to run the server.

No arguments — this server requires no environment variables.

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

serve - Start the Ollama server
create - Create a model from a Modelfile
show - Show information for a model
run - Run a model
pull - Pull a model from a registry
push - Push a model to a registry
list - List models
cp - Copy a model
rm - Remove a model
chat_completion - OpenAI-compatible chat completion API
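As a sketch of how an MCP client might invoke the chat_completion tool: the payload below follows the OpenAI chat format implied by the tool's description. The exact parameter names accepted by this server are an assumption, not confirmed by this listing — verify them against the server's tool schema.

```python
# Hypothetical arguments for the chat_completion tool.
# "model" and "messages" are assumed from the description
# ("OpenAI-compatible chat completion API"); check the tool's
# actual input schema before relying on these field names.
chat_completion_args = {
    "model": "llama3.2",  # any model already pulled into Ollama
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
}
```

An MCP client would send these arguments in a `tools/call` request naming `chat_completion`.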

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources


MCP directory API

We provide all information about MCP servers, including this one, via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp'
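The same endpoint can be fetched from a script. A minimal sketch using only the Python standard library, assuming the endpoint returns JSON (the response format is not specified on this page):

```python
import json
import urllib.request

# Directory API endpoint for this server, as given in the curl example above.
URL = "https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp"

def fetch_server_info(url: str = URL) -> dict:
    """Fetch this server's directory entry; assumes a JSON response body."""
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```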

If you have feedback or need assistance with the MCP directory API, please join our Discord server.