# MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
## Requirements

- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., `ollama pull llama2`)
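If you are unsure whether Ollama is set up, you can check from a terminal:

```bash
# confirm the Ollama daemon is reachable and at least one model is installed
ollama list
```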
Configure Claude Desktop
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json
on macOS, %APPDATA%\Claude\claude_desktop_config.json
on Windows):
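A minimal entry might look like the following sketch; the `mcp-ollama` command name is an assumption and should match however you installed the server:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
```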
## Development

Install in development mode:
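Assuming a checkout of this repository with a standard Python project layout, an editable install might look like:

```bash
# from the repository root (uv shown; plain pip works the same way)
uv pip install -e .
# or: pip install -e .
```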
Test with MCP Inspector:
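The MCP Inspector can launch the server over stdio for interactive testing; the `mcp-ollama` entry point below is an assumption:

```bash
# run the server under the MCP Inspector
npx @modelcontextprotocol/inspector uv run mcp-ollama
```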
## Features

The server provides three main tools:

- `list_models`: List all downloaded Ollama models
- `show_model`: Get detailed information about a specific model
- `ask_model`: Ask a question to a specified model
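As a rough illustration of what these tools do, here is a minimal sketch of an equivalent server built on the official `mcp` Python SDK and the `ollama` client library; it mirrors the tool list above but is not the project's actual source:

```python
# Sketch only: a minimal MCP server exposing the three tools above,
# built on the official `mcp` Python SDK and the `ollama` client.
from mcp.server.fastmcp import FastMCP
import ollama

mcp = FastMCP("ollama")

@mcp.tool()
def list_models() -> list[str]:
    """List all downloaded Ollama models."""
    return [m["model"] for m in ollama.list()["models"]]

@mcp.tool()
def show_model(name: str) -> str:
    """Get detailed information about a specific model."""
    return str(ollama.show(name))

@mcp.tool()
def ask_model(model: str, question: str) -> str:
    """Ask a question to the specified model and return its answer."""
    response = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": question}],
    )
    return response["message"]["content"]

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```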
## License

MIT
## Local-only server

This server can only run on the client's local machine because it depends on local resources (a running Ollama installation). It integrates Ollama models with MCP clients so that users can list their models, get detailed information about them, and interact with them by asking questions.