
mcp-ollama

MCP server wrapping local Ollama models for offload from API-priced orchestrators.

Exposes 9 tools that pass work to a local model (text generation, summarisation, code tasks, mechanical transforms, commit/PR/changelog drafting). The orchestrator decides what to route locally; this server does the routing.

  • Transport: stdio

  • Runtime: Node 18+

  • Default model: hermes3:8b (override via OLLAMA_MODEL)

  • Ollama host: http://localhost:11434 (override via OLLAMA_HOST)

  • License: Apache-2.0

Install

npm install
npm run build

You also need a running Ollama instance with at least one model pulled:

ollama pull hermes3:8b
ollama pull qwen2.5-coder:32b  # optional, for local_code

Run (stdio)

node dist/index.js
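The server speaks MCP over stdio as newline-delimited JSON-RPC 2.0 messages. As a quick smoke test you can pipe a handshake into the process above; this is a sketch, and the exact protocolVersion string is an assumption that may need to match your MCP client's version:

```shell
# Minimal MCP initialize request (JSON-RPC 2.0, one message per line).
# Pipe it into the server to confirm the stdio transport responds:
#   printf '%s\n' "$REQ" | node dist/index.js
REQ='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
printf '%s\n' "$REQ"
```

A healthy server answers with an initialize result on stdout before accepting tool calls.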

Configure Claude Code

claude mcp add --transport stdio ollama -- node /absolute/path/to/mcp-ollama/dist/index.js

Or in ~/.claude/settings.json:

{
  "mcpServers": {
    "ollama": {
      "transport": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/mcp-ollama/dist/index.js"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434",
        "OLLAMA_MODEL": "hermes3:8b"
      }
    }
  }
}

Tools

| Tool | Purpose |
| --- | --- |
| local_generate | General-purpose generation with system + user prompt |
| local_summarize | Summarise a blob of text |
| local_analyze | Analyse text against a specific question |
| local_draft | Draft content in a given style |
| local_code | Code tasks: docstring / test / explain / review / types / refactor-suggest |
| local_diff | Diff-driven tasks: commit-message / pr-description / changelog / summary / impact |
| local_transform | Mechanical code transformations |
| local_models | List models available on the local Ollama host |
| local_pull | Pull a model onto the local Ollama host |

Full tool schemas are exposed over MCP introspection.
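As an illustration, a tools/call request for local_summarize might look like the sketch below. The argument name "text" is an assumption for this example; consult the introspected schema for the actual parameter names. The "model" argument is optional and falls back to OLLAMA_MODEL when omitted:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "local_summarize",
    "arguments": {
      "text": "…the blob of text to summarise…",
      "model": "hermes3:8b"
    }
  }
}
```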

Environment variables

| Variable | Default | Purpose |
| --- | --- | --- |
| OLLAMA_HOST | http://localhost:11434 | Ollama HTTP endpoint |
| OLLAMA_MODEL | hermes3:8b | Default model when a tool call omits model |
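Both variables can also be overridden per invocation with inline environment assignments, without touching the Claude Code config. The snippet below demonstrates the scoping with a plain shell child process; in practice the command would be node dist/index.js:

```shell
# Inline assignments apply only to the one process they prefix, e.g.:
#   OLLAMA_MODEL=qwen2.5-coder:32b node dist/index.js
# Demonstrated here with a child shell that echoes the variable back:
OLLAMA_MODEL=qwen2.5-coder:32b sh -c 'printf "%s\n" "$OLLAMA_MODEL"'
# → qwen2.5-coder:32b
```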

Why

Orchestrators priced by the token (Claude Code, Cursor, the Anthropic API) pay for every classification, every docstring, every commit message. Most of that work doesn't need Opus or GPT-5. Routed to Ollama on the same machine, the same work is free and faster. mcp-ollama is the routing surface.

Part of ALTER

mcp-ollama is maintained by ALTER as part of the identity infrastructure for the AI economy. The ALTER identity MCP server is hosted at mcp.truealter.com.
