# MCP Ollama Consult Server
An MCP (Model Context Protocol) server that allows consulting with Ollama models for reasoning from alternative viewpoints.
## Features
- `consult_ollama`: Send prompts to Ollama models and get responses
- `list_ollama_models`: List available models on the local Ollama instance
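Under the hood, these two tools map onto two endpoints of Ollama's HTTP API: `POST /api/generate` and `GET /api/tags`. Below is a minimal sketch of how they could be registered with the TypeScript MCP SDK; it is illustrative only, not this project's actual source, and the server name and input schemas are assumptions:

```typescript
// Illustrative sketch, not the project's actual implementation.
// Assumes @modelcontextprotocol/sdk and zod are installed.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const OLLAMA_BASE_URL = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";

const server = new McpServer({ name: "ollama-consult", version: "1.0.0" });

// consult_ollama: forward a prompt to an Ollama model and return its reply.
server.tool(
  "consult_ollama",
  { model: z.string(), prompt: z.string() },
  async ({ model, prompt }) => {
    const res = await fetch(`${OLLAMA_BASE_URL}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    });
    const data = (await res.json()) as { response: string };
    return { content: [{ type: "text", text: data.response }] };
  }
);

// list_ollama_models: query Ollama's tag endpoint for installed models.
server.tool("list_ollama_models", async () => {
  const res = await fetch(`${OLLAMA_BASE_URL}/api/tags`);
  const data = (await res.json()) as { models: { name: string }[] };
  return {
    content: [{ type: "text", text: data.models.map((m) => m.name).join("\n") }],
  };
});

// MCP clients talk to this server over stdio.
await server.connect(new StdioServerTransport());
```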
## Installation
1. Ensure you have Node.js installed.
2. Install dependencies:

   ```bash
   npm install
   ```

3. Build the project:

   ```bash
   npm run build
   ```
## Usage
Make sure Ollama is running locally (default: `http://localhost:11434`).
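A quick way to confirm the instance is reachable is Ollama's model-listing endpoint:

```bash
# Should return a JSON list of installed models if Ollama is up.
curl http://localhost:11434/api/tags
```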
Start the MCP server:
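The exact command depends on the scripts defined in `package.json`; a conventional setup would be:

```bash
npm start
```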
Or for development:
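```bash
# Assumes a "dev" script (e.g. watch mode) is defined in package.json.
npm run dev
```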
## Configuration
Set the `OLLAMA_BASE_URL` environment variable to change the Ollama endpoint:
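For example, to target Ollama on another machine (the address shown is just a placeholder):

```bash
# Point the server at a non-default Ollama endpoint before starting it.
export OLLAMA_BASE_URL=http://192.168.1.50:11434
```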
## Docker
To run with Docker, first build the image, then run the container:
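A typical invocation might look like the following; the image tag is illustrative, and `-i` is needed because stdio-based MCP servers communicate over stdin/stdout. Note that `host.docker.internal` resolves to the host on Docker Desktop; on Linux, use the host's address or `--network=host`:

```bash
# Build the image (tag name is an arbitrary choice).
docker build -t mcp-ollama-consult .

# Run interactively so the MCP client can talk to the server over stdio.
docker run -i --rm \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  mcp-ollama-consult
```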
## Requirements
- Node.js 18+
- Ollama running locally or accessible via HTTP