MCP Ollama Server

by emgeee
local-only server

The server can only run on the client’s local machine because it depends on a locally installed and running Ollama instance.

Integrations

  • Integrates with Ollama, making Ollama models usable through the MCP interface: tools are provided to list models, get model details, and ask questions of Ollama models.

MCP Ollama

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher
  • Ollama installed and running (https://ollama.com/download)
  • At least one model pulled with Ollama (e.g., ollama pull llama2); a quick check of these last two requirements is sketched just below this list
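
Before wiring up the server, you can verify that Ollama is running and has at least one model by querying Ollama's local HTTP API directly. The following is a minimal sketch, assuming Ollama's default endpoint (http://localhost:11434); /api/tags is Ollama's standard model-listing route:

import json
import urllib.request

# Ask the local Ollama daemon (default port 11434) which models are installed.
OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"

try:
    with urllib.request.urlopen(OLLAMA_TAGS_URL, timeout=5) as resp:
        data = json.load(resp)
except OSError as exc:
    raise SystemExit(f"Ollama does not appear to be running: {exc}")

models = [m["name"] for m in data.get("models", [])]
if not models:
    raise SystemExit("Ollama is running, but no models are pulled; try: ollama pull llama2")

print("Available models:", ", ".join(models))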

Configure Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{ "mcpServers": { "ollama": { "command": "uvx", "args": [ "mcp-ollama" ] } } }

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync

Test with MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server provides three main tools; a sketch of calling them from an MCP client follows the list:

  • list_models - List all downloaded Ollama models
  • show_model - Get detailed information about a specific model
  • ask_model - Ask a question to a specified model
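
To try these tools without Claude Desktop, you can drive the server from the official MCP Python SDK over stdio. The sketch below is illustrative rather than part of this repository: it launches the server the same way the Claude Desktop config above does, and the ask_model argument names ("model", "question") are assumptions; check the tool schemas returned by list_tools for the real ones.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server over stdio, mirroring the Claude Desktop config.
    params = StdioServerParameters(command="uvx", args=["mcp-ollama"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes.
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])

            # Hypothetical argument names; consult the tool schema printed above.
            result = await session.call_tool(
                "ask_model",
                {"model": "llama2", "question": "Why is the sky blue?"},
            )
            print(result.content)

asyncio.run(main())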

License

MIT


