Ollama MCP Server
A bridge to use Ollama as an MCP server from Claude Code. It provides tools for text generation, multi-turn chat, vision-based image analysis, model management (listing, showing details, pulling), and text embedding generation.
Features
ollama_generate: Single-turn text generation (supports vision models with image input)
ollama_chat: Multi-turn chat conversations (supports vision models with image input)
ollama_list: List available models
ollama_show: Show model details
ollama_pull: Download models
ollama_embeddings: Generate text embeddings
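Under the hood, Claude Code invokes these tools through MCP's JSON-RPC `tools/call` method. A minimal sketch of what a request to `ollama_generate` could look like (the `model` and `prompt` argument names are illustrative assumptions, not taken from this server's schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ollama_generate",
    "arguments": {
      "model": "llama3.2",
      "prompt": "Explain MCP in one sentence."
    }
  }
}
```

In practice you never write this by hand; Claude Code constructs it when you ask for an Ollama tool in chat.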
Supported Vision Models
llava - General-purpose vision model
llama3.2-vision - Meta's multimodal model
deepseek-ocr - OCR-specialized vision model
Prerequisites
Ollama installed and running
# Install Ollama (macOS)
brew install ollama

# Start Ollama server
ollama serve

At least one model downloaded:
ollama pull llama3.2
Installation
cd ollama-mcp-server
npm install
npm run build

Claude Code Configuration
Method 1: Using CLI (Recommended)
# Add to local scope (current project)
claude mcp add --transport stdio ollama -- node /path/to/ollama-mcp-server/dist/index.js
# Add to user scope (all projects)
claude mcp add --transport stdio ollama --scope user -- node /path/to/ollama-mcp-server/dist/index.js

To add environment variables:
claude mcp add --transport stdio ollama \
--env OLLAMA_BASE_URL=http://localhost:11434 \
  -- node /path/to/ollama-mcp-server/dist/index.js

Method 2: Manual Configuration
Project scope (.mcp.json in project root):
{
"mcpServers": {
"ollama": {
"command": "node",
"args": ["/path/to/ollama-mcp-server/dist/index.js"],
"env": {
"OLLAMA_BASE_URL": "http://localhost:11434"
}
}
}
}

User scope (~/.claude.json):
{
"mcpServers": {
"ollama": {
"command": "node",
"args": ["/path/to/ollama-mcp-server/dist/index.js"],
"env": {
"OLLAMA_BASE_URL": "http://localhost:11434"
}
}
}
}

Verify Installation
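When editing the JSON configuration by hand, a stray comma or quote will silently break server registration. As a quick sanity check (a sketch assuming `python3` is on your PATH; any JSON validator works), confirm the project-scope config parses before launching Claude Code:

```shell
# Fail fast if .mcp.json contains a JSON syntax error
python3 -m json.tool .mcp.json > /dev/null && echo ".mcp.json is valid JSON"
```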
# List configured MCP servers
claude mcp list
# Inside Claude Code
/mcp

Auto-approve Tool Calls (Optional)
By default, Claude Code asks for confirmation each time an Ollama tool is called. To skip confirmations, add the following to ~/.claude/settings.json:
{
"permissions": {
"allow": [
"mcp__ollama__ollama_generate",
"mcp__ollama__ollama_chat",
"mcp__ollama__ollama_list",
"mcp__ollama__ollama_show",
"mcp__ollama__ollama_pull",
"mcp__ollama__ollama_embeddings"
]
}
}

Environment Variables
Variable | Default | Description
OLLAMA_BASE_URL | http://localhost:11434 | Ollama server URL
Usage Examples
From Claude Code:
List Models
List available Ollama models

Text Generation
Generate "3 features of Rust" using Ollama's llama3.2 model

Chat
I'd like to have Ollama do a code review

Vision / Image Analysis
Analyze this image using llava: /path/to/image.jpg
Use deepseek-ocr to extract text from this document: /path/to/document.png

Troubleshooting
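When a prompt fails, it helps to separate MCP problems from Ollama problems by calling Ollama's HTTP API directly, bypassing the MCP server. A sketch, assuming Ollama's default port and that the llama3.2 model has been pulled:

```shell
# Hit Ollama's generate endpoint directly; a connection error or non-JSON
# response points at Ollama itself rather than the MCP server.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Say hello", "stream": false}'
```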
Cannot connect to Ollama
# Check if Ollama is running
curl http://localhost:11434/api/tags
# If not running
ollama serve

No models available
ollama pull llama3.2

MCP server not showing up
# Verify server is registered
claude mcp list
# Check server health
claude mcp get ollama

License
MIT