# Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
## Features
- List available Ollama models
- Pull new models from Ollama
- Chat with models using Ollama's chat API
- Get detailed model information
- Automatic port management
- Environment variable configuration
## Prerequisites
- Node.js (v16 or higher)
- npm
- Ollama installed and running locally
## Installation

### Installing via Smithery
To install Ollama MCP Server for Claude Desktop automatically via Smithery:
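A sketch of the usual Smithery CLI invocation (the server identifier below is a placeholder assumption; check the Smithery registry for this server's published name):

```shell
# Server ID is illustrative; replace with the actual Smithery package name
npx -y @smithery/cli install ollama-mcp --client claude
```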
### Manual Installation

Install globally via npm:
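For example, assuming the package is published under the name shown (verify the actual name on npm):

```shell
# Package name is an assumption for illustration
npm install -g ollama-mcp
```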
### Installing in Other MCP Applications
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
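A minimal sketch of such an entry (the `ollama-mcp` command name is an assumption based on a global npm install; adjust to match your setup):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "ollama-mcp",
      "env": {
        "PORT": "3456",
        "OLLAMA_API": "http://localhost:11434"
      }
    }
  }
}
```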
The settings file location varies by application:

- **Claude Desktop**: `claude_desktop_config.json` in the Claude app data directory
- **Cline**: `cline_mcp_settings.json` in the VS Code global storage
## Usage

### Starting the Server
Simply run:
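Assuming the package installs an `ollama-mcp` binary (name assumed from the install step above):

```shell
ollama-mcp
```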
The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:
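For example:

```shell
# Binary name assumed from the install step above
PORT=3457 ollama-mcp
```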
## Environment Variables
- `PORT`: Server port (default: 3456). Can be set both when running directly and during Smithery installation.
- `OLLAMA_API`: Ollama API endpoint (default: `http://localhost:11434`)
## API Endpoints
- `GET /models` - List available models
- `POST /models/pull` - Pull a new model
- `POST /chat` - Chat with a model
- `GET /models/:name` - Get model details
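As a quick smoke test with `curl` (assumes the server is running on the default port; the chat request body shape shown here mirrors Ollama's own chat API and is an assumption, not documented by this server):

```shell
# List available models
curl -s http://localhost:3456/models

# Chat with a model (request shape assumed to follow Ollama's chat API)
curl -s -X POST http://localhost:3456/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello"}]}'
```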
## Development
1. Clone the repository
2. Install dependencies
3. Build the project
4. Start the server
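The steps above as one shell sequence (the repository URL is elided here, and the script names assume conventional npm scripts):

```shell
git clone <repo-url>   # replace with the project's repository URL
cd ollama-mcp
npm install
npm run build
npm start
```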
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
MIT