# MCP Ollama Consult Server
[TypeScript](https://www.typescriptlang.org/)
[Model Context Protocol](https://modelcontextprotocol.io)
[CI](https://github.com/Atomic-Germ/mcp-consult/actions/workflows/ci.yml)
[Glama MCP Server](https://glama.ai/mcp/servers/@Atomic-Germ/mcp-consult)
An MCP (Model Context Protocol) server for consulting Ollama models to get reasoning from alternative viewpoints.
## Features
- **consult_ollama**: Send prompts to Ollama models and get responses
- **list_ollama_models**: List available models on the local Ollama instance
- **compare_ollama_models**: Run the same prompt against multiple Ollama models and return their outputs side by side for comparison
- **remember_consult**: Store the result of a consult in a local memory store (or a configured memory service)
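In most MCP clients the assistant invokes these tools for you, but each call is ultimately a standard MCP `tools/call` request. A minimal sketch of consulting a model (the `model` and `prompt` argument names are assumptions, not taken from this server's published schema):
```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_ollama",
    "arguments": {
      "model": "llama3",
      "prompt": "Argue against adopting microservices for a three-person team."
    }
  }
}
```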
## Installation
1. Install the server:
```bash
npm i -g https://github.com/Atomic-Germ/mcp-consult/releases/download/v1.0.1/mcp-ollama-consult-1.0.1.tgz
```
2. Register the server in your MCP client configuration:
```json
{
  "servers": {
    "ollama-consult": {
      "type": "stdio",
      "command": "mcp-ollama-consult",
      "args": []
    }
  },
  "inputs": []
}
```
## Usage
Make sure Ollama is running locally (default: [http://localhost:11434](http://localhost:11434)).
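You can check that Ollama is reachable before starting the server by listing its installed models via Ollama's `/api/tags` endpoint:
```bash
curl http://localhost:11434/api/tags
```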
Start the MCP server:
```bash
mcp-ollama-consult
```
Or for development:
```bash
npm run dev
```
## Configuration
Set the `OLLAMA_BASE_URL` environment variable to change the Ollama endpoint:
```bash
OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
```
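Depending on your MCP client, you may also be able to set the variable per server instead of globally; many client configurations accept an `env` field (a sketch based on the configuration format shown above, so check your client's documentation):
```json
{
  "servers": {
    "ollama-consult": {
      "type": "stdio",
      "command": "mcp-ollama-consult",
      "args": [],
      "env": {
        "OLLAMA_BASE_URL": "http://your-ollama-server:11434"
      }
    }
  },
  "inputs": []
}
```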
## Docker
To run with Docker, use a Dockerfile like the following (it copies the compiled `dist/` output, so build the project first):
```dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY dist/ ./dist/
CMD ["node", "dist/index.js"]
```
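A possible build-and-run sequence, assuming `npm run build` is the script that produces `dist/` and using an arbitrary image name (`host.docker.internal` lets the container reach an Ollama instance on the Docker host; adjust the URL for your setup):
```bash
npm ci && npm run build
docker build -t mcp-ollama-consult .
docker run --rm -i \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  mcp-ollama-consult
```
The `-i` flag keeps stdin open, which the stdio transport requires.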
## Requirements
- Node.js 18+
- Ollama running locally or accessible via HTTP