# MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
## Requirements
- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., `ollama pull llama2`)
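To confirm Ollama is running and a model is available before wiring it into an MCP client, you can use the standard Ollama CLI and HTTP API (shown on Ollama's default port, 11434):

```sh
# List locally installed models via the CLI
ollama list

# Or query the daemon's HTTP API directly (default port 11434)
curl http://localhost:11434/api/tags
```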
## Configure Claude Desktop
Add to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
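A minimal entry might look like the following sketch; it assumes the server is published as `mcp-ollama` and launchable with `uvx` (adjust the `command` and `args` to however you installed the server):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
```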
## Development
Install in development mode:
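Assuming a standard `pyproject.toml`-based layout, an editable install would look like:

```sh
# Editable install of the server and its dependencies (assumes pyproject.toml)
pip install -e .
```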
Test with MCP Inspector:
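The MCP Inspector can launch the server and exercise its tools interactively. The exact entry point depends on how you run the project; the `uv run mcp-ollama` target below is an assumption:

```sh
# Launch the server under the MCP Inspector for interactive testing
npx @modelcontextprotocol/inspector uv run mcp-ollama
```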
## Features
The server provides three main tools:

- `list_models` - List all downloaded Ollama models
- `show_model` - Get detailed information about a specific model
- `ask_model` - Ask a question to a specified model
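For orientation, these tools map roughly onto the standard Ollama CLI commands:

```sh
ollama list                               # list_models: enumerate downloaded models
ollama show llama2                        # show_model: details for one model
ollama run llama2 "Why is the sky blue?"  # ask_model: one-shot question to a model
```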
## License
MIT