Ollama MCP Server
An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
Features
- List available Ollama models
- Pull new models from Ollama
- Chat with models using Ollama's chat API
- Get detailed model information
- Automatic port management
- Environment variable configuration
Prerequisites
- Node.js (v16 or higher)
- npm
- Ollama installed and running locally
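To confirm that a local Ollama instance is reachable before starting the server, you can query Ollama's own API (its `/api/tags` endpoint lists installed models):

```bash
# Verify Ollama is running; returns the locally installed models as JSON
curl http://localhost:11434/api/tags
```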
Installation
Manual Installation
Install globally via npm:
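A sketch of the command, assuming the package is published as `@rawveg/ollama-mcp` (verify the name against the project's package.json):

```bash
# Package name is an assumption; check npm or the repository
npm install -g @rawveg/ollama-mcp
```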
Installing in Other MCP Applications
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
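A minimal sketch of such an entry, assuming the global install exposes an `ollama-mcp` command (the server name and command here are illustrative):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "ollama-mcp",
      "env": {
        "PORT": "3456",
        "OLLAMA_API": "http://localhost:11434"
      }
    }
  }
}
```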
The settings file location varies by application:

- Claude Desktop: `claude_desktop_config.json` in the Claude app data directory
- Cline: `cline_mcp_settings.json` in the VS Code global storage
Usage
Starting the Server
Simply run:
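Assuming the `ollama-mcp` binary from the install step:

```bash
ollama-mcp
```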
The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:
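For example:

```bash
# Run on port 4000 instead of the default 3456
PORT=4000 ollama-mcp
```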
Environment Variables
- `PORT`: Server port (default: 3456). Can be set when running the server directly.
- `OLLAMA_API`: Ollama API endpoint (default: `http://localhost:11434`)
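Both variables can be set together, for example to point the server at an Ollama instance on another host (the binary name is assumed as above):

```bash
# Non-default port plus a remote Ollama endpoint
PORT=4000 OLLAMA_API=http://192.168.1.10:11434 ollama-mcp
```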
API Endpoints
- `GET /models` - List available models
- `POST /models/pull` - Pull a new model
- `POST /chat` - Chat with a model
- `GET /models/:name` - Get model details
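A sketch of calling these endpoints with curl; the request body for `/chat` is assumed to mirror the shape of Ollama's own chat API, and the model name is illustrative:

```bash
# List the models the server exposes
curl http://localhost:3456/models

# Chat with a model (request shape assumed to mirror Ollama's chat API)
curl -X POST http://localhost:3456/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello!"}]}'
```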
Development
- Clone the repository
- Install dependencies
- Build the project
- Start the server
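A minimal sketch of these steps, assuming the repository lives at `rawveg/ollama-mcp` on GitHub and defines the usual `build` and `start` npm scripts:

```bash
# Repository URL and script names are assumptions; check the project itself
git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
npm install
npm run build
npm start
```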
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
However, this does not grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. While I previously accepted contributions (such as a Dockerfile and related README updates) to support integration with services like Smithery, recent actions by a similar service — Glama — have required a reassessment of this policy.
Glama has chosen to include open-source MCP projects in their commercial offering without notice or consent, and subsequently created issue requests asking maintainers to perform unpaid work to ensure compatibility with their platform. This behaviour — leveraging community labour for profit without dialogue or compensation — is not only inconsiderate, but ethically problematic.
As a result, and to protect the integrity of this project and its contributors, the licence has been updated to the GNU Affero General Public License v3.0 (AGPL-3.0). This change ensures that any use of the software, particularly in commercial or service-based platforms, must either remain fully compliant with the AGPL's terms or obtain a separate commercial licence. Merely linking to the original source is not sufficient where the project is being actively monetised. If you wish to include this project in a commercial offering, please get in touch first to discuss licensing terms.
License
AGPL v3.0
This project was previously MIT-licensed. As of 20th April 2025, it is now licensed under AGPL-3.0 to prevent unauthorised commercial exploitation. If your use of this project predates this change, please refer to the relevant Git tag or commit for the applicable licence.