Ollama MCP Server

An MCP (Model Context Protocol) server that provides tools to interact with Ollama models running on your local machine.

Features

  • List Models: Get all available Ollama models

  • Chat: Interactive chat with conversation history

  • Generate: Single prompt generation

  • Pull Models: Download new models from Ollama registry

  • Delete Models: Remove models from local installation

Prerequisites

  • Ollama installed and running locally

  • Node.js 18+ and npm
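
To confirm that Ollama is running and reachable before installing the server, you can query its HTTP API; the /api/tags endpoint lists the models you have installed:

curl http://localhost:11434/api/tags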

Installation

  1. Clone or download this repository

  2. Install dependencies:

npm install

  3. Build the project:

npm run build

Usage

Development Mode

npm run dev

Production Mode

npm run build
npm start

Using with Claude Desktop

Add this server to your Claude Desktop configuration:

{ "mcpServers": { "ollama": { "command": "node", "args": ["/path/to/ollama-mcp/dist/index.js"] } } }

Using with Cursor

If you're using Cursor, add the server to your MCP configuration file at ~/.cursor/mcp/config.json:

{ "mcpServers": { "ollama": { "command": "node", "args": ["/path/to/ollama-mcp/dist/index.js"] } } }

Alternatively, you can copy the ready-made config shipped with this repo:

mkdir -p ~/.cursor/mcp
cp /path/to/ollama-mcp/mcp.config.json ~/.cursor/mcp/config.json

Available Tools

ollama_list_models

Lists all available Ollama models on your system.
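
The tool takes no arguments; a minimal MCP tools/call request for it looks like this (illustrative JSON-RPC envelope):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ollama_list_models",
    "arguments": {}
  }
}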

ollama_chat

Chat with a model using conversation history.

  • model: Name of the Ollama model

  • messages: Array of message objects with role ('system', 'user', 'assistant') and content
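
For example, the tool's arguments might look like this (illustrative; llama3 stands in for whatever model you have pulled):

{
  "model": "llama3",
  "messages": [
    { "role": "system", "content": "You are a concise assistant." },
    { "role": "user", "content": "Summarize the Model Context Protocol in one sentence." }
  ]
}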

ollama_generate

Generate a response from a single prompt.

  • model: Name of the Ollama model

  • prompt: The input prompt
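
Example arguments (again with llama3 as a stand-in model name):

{
  "model": "llama3",
  "prompt": "Write a haiku about local inference."
}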

ollama_pull_model

Download a model from the Ollama registry.

  • model: Name of the model to download
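
Example arguments (model names follow the usual Ollama name:tag form):

{
  "model": "llama3"
}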

ollama_delete_model

Remove a model from your local installation.

  • model: Name of the model to delete
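
The arguments mirror ollama_pull_model:

{
  "model": "llama3"
}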

Configuration

Set the OLLAMA_BASE_URL environment variable to change the Ollama server URL (default: http://localhost:11434).
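
If the server is launched by an MCP client, the variable can be set in that client's config; for example, Claude Desktop's server entries accept an env map (shown here with the default URL):

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp/dist/index.js"],
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}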

License

MIT
