Ollama MCP Server

A bridge that exposes Ollama to Claude Code as an MCP server.

A Japanese version of this README is also available.

Features

  • ollama_generate: Single-turn text generation (supports vision models with image input)

  • ollama_chat: Multi-turn chat conversations (supports vision models with image input)

  • ollama_list: List available models

  • ollama_show: Show model details

  • ollama_pull: Download models

  • ollama_embeddings: Generate text embeddings
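
Each tool is a thin wrapper over the corresponding Ollama REST endpoint. As a rough sketch of what ollama_generate does under the hood (the exact fields the server forwards may differ), the equivalent direct call is:

    # What ollama_generate roughly corresponds to on the Ollama REST API
    curl http://localhost:11434/api/generate \
      -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'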

Supported Vision Models

  • llava - General-purpose vision model

  • llama3.2-vision - Meta's multimodal model

  • deepseek-ocr - OCR-specialized vision model
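
Ollama's API accepts images as base64-encoded strings in an "images" array; the MCP tools take an image path and handle the encoding for you. A direct equivalent looks roughly like this (macOS base64 syntax; on Linux use base64 -w0 image.jpg):

    # Sketch: send an image to llava directly via the Ollama API
    curl http://localhost:11434/api/generate \
      -d "{\"model\": \"llava\", \"prompt\": \"Describe this image\", \"images\": [\"$(base64 -i image.jpg)\"], \"stream\": false}"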

Prerequisites

  1. Ollama installed and running

    # Install Ollama (macOS)
    brew install ollama

    # Start Ollama server
    ollama serve
  2. At least one model downloaded

    ollama pull llama3.2
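
To confirm both prerequisites are met before continuing, you can query the server directly:

    # Should return a JSON list of local models that includes llama3.2
    curl http://localhost:11434/api/tags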

Installation

cd ollama-mcp-server
npm install
npm run build
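
If you want to exercise the built server before registering it with Claude Code, the MCP Inspector (a separate tool, not part of this repository) can connect to any stdio MCP server:

    npx @modelcontextprotocol/inspector node dist/index.js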

Claude Code Configuration

Method 1: claude mcp add Command

# Add to local scope (current project)
claude mcp add --transport stdio ollama -- node /path/to/ollama-mcp-server/dist/index.js

# Add to user scope (all projects)
claude mcp add --transport stdio ollama --scope user -- node /path/to/ollama-mcp-server/dist/index.js

To add environment variables:

claude mcp add --transport stdio ollama \
  --env OLLAMA_BASE_URL=http://localhost:11434 \
  -- node /path/to/ollama-mcp-server/dist/index.js

Method 2: Manual Configuration

Project scope (.mcp.json in project root):

{ "mcpServers": { "ollama": { "command": "node", "args": ["/path/to/ollama-mcp-server/dist/index.js"], "env": { "OLLAMA_BASE_URL": "http://localhost:11434" } } } }

User scope (~/.claude.json):

{ "mcpServers": { "ollama": { "command": "node", "args": ["/path/to/ollama-mcp-server/dist/index.js"], "env": { "OLLAMA_BASE_URL": "http://localhost:11434" } } } }

Verify Installation

# List configured MCP servers
claude mcp list

# Inside Claude Code
/mcp

Auto-approve Tool Calls (Optional)

By default, Claude Code asks for confirmation each time an Ollama tool is called. To skip confirmations, add the following to ~/.claude/settings.json:

{ "permissions": { "allow": [ "mcp__ollama__ollama_generate", "mcp__ollama__ollama_chat", "mcp__ollama__ollama_list", "mcp__ollama__ollama_show", "mcp__ollama__ollama_pull", "mcp__ollama__ollama_embeddings" ] } }

Environment Variables

Variable          Default                  Description
OLLAMA_BASE_URL   http://localhost:11434   Ollama server URL
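
Overriding OLLAMA_BASE_URL is mainly useful when Ollama runs on another machine or a non-default port. For example (the address below is illustrative):

    claude mcp add --transport stdio ollama \
      --env OLLAMA_BASE_URL=http://192.168.1.50:11434 \
      -- node /path/to/ollama-mcp-server/dist/index.js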

Usage Examples

From Claude Code:

List Models

List available Ollama models

Text Generation

Generate "3 features of Rust" using Ollama's llama3.2 model

Chat

I'd like to have Ollama do a code review

Vision / Image Analysis

Analyze this image using llava: /path/to/image.jpg

Use deepseek-ocr to extract text from this document: /path/to/document.png
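
The other tools follow the same pattern; for example, to use ollama_embeddings:

Embeddings

Generate embeddings for "What is MCP?" using Ollama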

Troubleshooting

Cannot connect to Ollama

# Check if Ollama is running
curl http://localhost:11434/api/tags

# If not running
ollama serve

No models available

ollama pull llama3.2

MCP server not showing up

# Verify server is registered
claude mcp list

# Check server health
claude mcp get ollama
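
If the server is registered but unhealthy, removing and re-adding it often resolves a stale path or configuration:

    claude mcp remove ollama
    claude mcp add --transport stdio ollama -- node /path/to/ollama-mcp-server/dist/index.js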

License

MIT
