MCP Ollama Server

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher
  • Ollama installed and running (https://ollama.com/download)
  • At least one model pulled with Ollama (e.g., ollama pull llama2)
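
To confirm both requirements are met before wiring up the server, you can query Ollama's local HTTP API directly. The sketch below is illustrative only; it assumes Ollama's default endpoint on port 11434 and uses just the Python standard library.

# Quick sanity check: is Ollama running locally, and is at least one model pulled?
# Assumes the default Ollama endpoint http://localhost:11434.
import json
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
except OSError:
    raise SystemExit("Ollama does not appear to be running on localhost:11434")

if not models:
    raise SystemExit("No models found; pull one first, e.g. `ollama pull llama2`")

print("Available models:", ", ".join(m["name"] for m in models))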

Configure Claude Desktop

Add this to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
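
If you prefer to apply this change programmatically, the following hypothetical helper (not part of the project) merges the same entry into an existing claude_desktop_config.json, resolving the macOS or Windows path listed above.

# Merge the "ollama" server entry into Claude Desktop's config file.
# Illustrative helper only; paths follow the locations mentioned above.
import json
import os
import platform
from pathlib import Path

if platform.system() == "Darwin":  # macOS
    config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
else:  # Windows
    config_path = Path(os.environ["APPDATA"]) / "Claude" / "claude_desktop_config.json"

config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["ollama"] = {"command": "uvx", "args": ["mcp-ollama"]}
config_path.write_text(json.dumps(config, indent=2))
print(f"Updated {config_path}")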

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync

Test with the MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server provides the following main tools (a minimal implementation sketch follows the list):

  • list_models - List all downloaded Ollama models
  • show_model - Get detailed information about a specific model
  • ask_model - Ask a question to a specified model
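
For orientation, here is a minimal sketch of how tools like these can be exposed. It is not the project's actual source; it assumes the MCP Python SDK's FastMCP helper and the ollama Python client, and the exact response handling may vary with the installed ollama client version.

# Minimal sketch of an Ollama-backed MCP server exposing the tools listed above.
from mcp.server.fastmcp import FastMCP
import ollama

mcp = FastMCP("ollama")

@mcp.tool()
def list_models() -> str:
    """List all downloaded Ollama models."""
    return str(ollama.list())

@mcp.tool()
def show_model(name: str) -> str:
    """Get detailed information about a specific model."""
    return str(ollama.show(name))

@mcp.tool()
def ask_model(name: str, question: str) -> str:
    """Ask a question to the specified model and return its reply."""
    response = ollama.chat(model=name, messages=[{"role": "user", "content": question}])
    return response["message"]["content"]

if __name__ == "__main__":
    mcp.run()

Running the mcp dev command from the Development section against a file like this shows the same tool names in the MCP Inspector.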

License

MIT


local-only server

The server can only run on the client's local machine because it depends on local resources.

The MCP Ollama server integrates Ollama models with MCP clients, letting users list models, get detailed model information, and interact with models by asking questions.


Related MCP Servers

  • A Model Context Protocol (MCP) server that lets you seamlessly use OpenAI's models right from Claude. (JavaScript, MIT License)
  • A simple MCP server for interacting with OpenAI assistants. This server allows other tools (like Claude Desktop) to create and interact with OpenAI assistants through the Model Context Protocol. (Python, MIT License)
  • An interactive chat interface that combines Ollama's LLM capabilities with PostgreSQL database access through the Model Context Protocol (MCP). Ask questions about your data in natural language and get AI-powered responses backed by real SQL queries. (TypeScript)
  • Enables seamless integration between Ollama's local LLM models and MCP-compatible applications, supporting model management and chat interactions. (TypeScript, AGPL 3.0)

View all related MCP servers

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama'
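
The same endpoint can be queried from Python; this sketch simply mirrors the curl call above using the standard library.

# Fetch this server's directory entry from the Glama MCP API (same URL as the curl example).
import json
import urllib.request

url = "https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama"
with urllib.request.urlopen(url) as resp:
    print(json.dumps(json.load(resp), indent=2))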

If you have feedback or need assistance with the MCP directory API, please join our Discord server.