
MCP Ollama

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher

  • Ollama installed and running (https://ollama.com/download)

  • At least one model pulled with Ollama (e.g., ollama pull llama2)

Configure Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
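
If you prefer to generate or update the config programmatically, the entry above can be built and serialized like this (a minimal sketch; the file path and any existing keys in your config are assumptions you should adapt):

```python
import json

# Build the "ollama" server entry exactly as shown in the config snippet above.
config = {
    "mcpServers": {
        "ollama": {
            "command": "uvx",
            "args": ["mcp-ollama"],
        }
    }
}

# Serialize with indentation so the file stays human-editable.
print(json.dumps(config, indent=2))
```

Merge this dict into your existing claude_desktop_config.json rather than overwriting it, so other configured MCP servers are preserved.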

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync

Test with MCP Inspector:

mcp dev src/mcp_ollama/server.py


Features

The server provides the following tools:

  • list_models - List all downloaded Ollama models

  • show_model - Get detailed information about a specific model

  • ask_model - Ask a question to a specified model

License

MIT

