MCP Ollama Consult Server

An MCP (Model Context Protocol) server that allows consulting with Ollama models for reasoning from alternative viewpoints.

Features

  • consult_ollama: Send prompts to Ollama models and get responses (see the example request after this list)

  • list_ollama_models: List available models on the local Ollama instance

  • compare_ollama_models: Run the same prompt against multiple Ollama models and return their outputs side-by-side for comparison

  • remember_consult: Store the result of a consult into a local memory store (or configured memory service)
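
Once connected, an MCP client invokes these tools with ordinary JSON-RPC tools/call requests over stdio. A minimal sketch of a consult_ollama call (the argument names model and prompt are assumptions here; check the server's actual schema with tools/list):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_ollama",
    "arguments": {
      "model": "llama3",
      "prompt": "Steelman the opposing view of this design decision."
    }
  }
}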

Installation

  1. Install the server:

    npm i -g https://github.com/Atomic-Germ/mcp-consult/releases/download/v1.0.1/mcp-ollama-consult-1.0.1.tgz
  2. Configure the server:

    { "servers": { "ollama-consult": { "type": "stdio", "command": "mcp-ollama-consult", "args": [] } }, "inputs": [] }

Usage

Make sure Ollama is running locally (default: http://localhost:11434).
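
A quick way to confirm the endpoint is reachable is Ollama's standard tags API, which lists the models it has installed (plain Ollama behavior, not specific to this server):

curl http://localhost:11434/api/tags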

Start the MCP server:

mcp-ollama-consult

Or for development:

npm run dev

Configuration

Set the OLLAMA_BASE_URL environment variable to change the Ollama endpoint:

OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
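
If an MCP client launches the server for you, the variable can usually be set on the server entry instead. The env field below follows the common MCP client config shape (verify against your client's documentation):

{
  "servers": {
    "ollama-consult": {
      "type": "stdio",
      "command": "mcp-ollama-consult",
      "args": [],
      "env": { "OLLAMA_BASE_URL": "http://your-ollama-server:11434" }
    }
  },
  "inputs": []
}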

Docker

To run with Docker, build an image from a Dockerfile like the following:

FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY dist/ ./dist/
CMD ["node", "dist/index.js"]
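
Note that the Dockerfile copies a prebuilt dist/, so compile the project first. A build-and-run sketch; the image tag is arbitrary, and host.docker.internal resolves to the host on Docker Desktop (adjust for your setup):

docker build -t mcp-ollama-consult .
docker run -i --rm -e OLLAMA_BASE_URL=http://host.docker.internal:11434 mcp-ollama-consult

The -i flag keeps stdin open, which the stdio transport requires.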

Requirements

  • Node.js 18+

  • Ollama running locally or accessible via HTTP

