Ollama MCP Server

An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.

Features

  • List available Ollama models
  • Pull new models from Ollama
  • Chat with models using Ollama's chat API
  • Get detailed model information
  • Automatic port management
  • Environment variable configuration

Prerequisites

  • Node.js (v16 or higher)
  • npm
  • Ollama installed and running locally

Installation

Installing via Smithery

To install Ollama MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @rawveg/ollama-mcp --client claude

Manual Installation

Install globally via npm:

npm install -g @rawveg/ollama-mcp

Installing in Other MCP Applications

To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:

{ "mcpServers": { "@rawveg/ollama-mcp": { "command": "npx", "args": [ "-y", "@rawveg/ollama-mcp" ] } } }

The settings file location varies by application:

  • Claude Desktop: claude_desktop_config.json in the Claude app data directory
  • Cline: cline_mcp_settings.json in the VS Code global storage

Usage

Starting the Server

Simply run:

ollama-mcp

The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:

PORT=3457 ollama-mcp

Environment Variables

  • PORT: Server port (default: 3456). Can be set both when running directly and when installing via Smithery:

    # When running directly
    PORT=3457 ollama-mcp

    # When installing via Smithery
    PORT=3457 npx -y @smithery/cli install @rawveg/ollama-mcp --client claude

  • OLLAMA_API: Ollama API endpoint (default: http://localhost:11434); see the example below
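For example, to point the server at an Ollama instance running on another machine (the address below is a placeholder for your own host):

OLLAMA_API=http://192.168.1.50:11434 ollama-mcp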

API Endpoints

  • GET /models - List available models
  • POST /models/pull - Pull a new model
  • POST /chat - Chat with a model
  • GET /models/:name - Get model details
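A quick smoke test with curl, assuming the server is running on the default port. The request bodies shown for pull and chat mirror Ollama's own API, which this server fronts; the exact schema is best confirmed against the source if a request is rejected:

# List available models
curl http://localhost:3456/models

# Get details for a specific model
curl http://localhost:3456/models/llama3.2

# Pull a new model
curl -X POST http://localhost:3456/models/pull \
  -H "Content-Type: application/json" \
  -d '{"name": "llama3.2"}'

# Chat with a model
curl -X POST http://localhost:3456/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello!"}]}'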

Development

  1. Clone the repository:
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
  2. Install dependencies:
npm install
  3. Build the project:
npm run build
  4. Start the server:
npm start

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT
