MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients. It exposes tools to list Ollama models, get model details, and ask questions of Ollama models. This is a local-only server: it must run on the same machine as the MCP client, since it depends on a locally running Ollama instance.
Requirements
- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., `ollama pull llama2`; a quick check follows below)
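To confirm the prerequisites are met, you can check that Ollama is serving and has at least one model available (the curl check assumes Ollama's default port, 11434):

```sh
# Show locally available models; should list at least one entry
ollama list

# Ollama's local API should respond on its default port
curl http://localhost:11434/api/tags
```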
Configure Claude Desktop
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
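The exact entry depends on how you installed the server. As a minimal sketch, assuming the server is published as an `mcp-ollama` package runnable with `uvx` (both the `ollama` key and the command are illustrative; adjust them to match your installation):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
```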
Development
Install in development mode:
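For example, a standard editable install with pip, assuming the project ships a pyproject.toml (the repository URL below is a placeholder):

```sh
# Clone the repository (placeholder URL) and install it in editable
# mode so local changes take effect without reinstalling
git clone https://github.com/<owner>/mcp-ollama.git
cd mcp-ollama
pip install -e .
```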
Test with MCP Inspector:
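For example, using the official MCP Inspector via npx; `mcp-ollama` as the launch command is an assumption, so substitute however you start the server:

```sh
# Opens the MCP Inspector UI attached to the server over stdio
npx @modelcontextprotocol/inspector mcp-ollama
```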
Features
The server provides three main tools (a client-side usage sketch follows the list):
- `list_models` - List all downloaded Ollama models
- `show_model` - Get detailed information about a specific model
- `ask_model` - Ask a question to a specified model
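As a usage illustration, here is a minimal client built on the official `mcp` Python SDK. It assumes the server is launchable as `mcp-ollama` and that `ask_model` takes `model` and `question` arguments; both are assumptions, so check the schema returned by `list_tools()` for the real parameter names:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over stdio; "mcp-ollama" is an assumed entry
    # point -- adjust to however you actually start the server.
    params = StdioServerParameters(command="mcp-ollama")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Should print the three tools: list_models, show_model, ask_model
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Argument names here are illustrative; consult the tool
            # schema from list_tools() for the exact parameters.
            result = await session.call_tool(
                "ask_model",
                {"model": "llama2", "question": "Why is the sky blue?"},
            )
            print(result.content)


asyncio.run(main())
```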
License
MIT