MCP Ollama Consult Server

An MCP (Model Context Protocol) server that lets clients consult Ollama models for reasoning from alternative viewpoints.

Features

  • consult_ollama: Send prompts to Ollama models and get responses (see the client sketch after this list)

  • list_ollama_models: List available models on the local Ollama instance
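
As a minimal sketch of calling these tools from a client using the official MCP TypeScript SDK; the argument names for consult_ollama (model, prompt) and the model name here are assumptions, so check the schemas returned by listTools():

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Spawn the built server over stdio and connect to it.
    const transport = new StdioClientTransport({ command: "node", args: ["dist/index.js"] });
    const client = new Client({ name: "example-client", version: "0.1.0" });
    await client.connect(transport);

    // Discover the tools the server actually exposes, then consult a model.
    console.log(await client.listTools());
    const result = await client.callTool({
      name: "consult_ollama",
      // Argument names are assumptions; use the schema reported by listTools().
      arguments: { model: "llama3", prompt: "Argue the opposing viewpoint: ..." },
    });
    console.log(result.content);
    await client.close();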

Installation

  1. Ensure you have Node.js 18 or newer installed

  2. Install dependencies:

    npm install

  3. Build the project:

    npm run build

Usage

Make sure Ollama is running locally (default: http://localhost:11434).
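
You can confirm the endpoint is reachable before starting the server; a minimal sketch using Ollama's /api/tags endpoint, which lists the locally installed models:

    // Minimal reachability check against a local Ollama instance (Node 18+ has global fetch).
    const res = await fetch("http://localhost:11434/api/tags");
    if (!res.ok) throw new Error(`Ollama responded with ${res.status}`);
    const { models } = await res.json();
    // Each entry includes a model name such as "llama3:latest".
    console.log(models.map((m: { name: string }) => m.name));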

Start the MCP server:

npm start

Or for development:

npm run dev

Configuration

Set the OLLAMA_BASE_URL environment variable to change the Ollama endpoint:

OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
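
Inside the server, the endpoint is presumably resolved along these lines (a sketch for illustration, not the project's actual source):

    // Read the endpoint from the environment, falling back to the local default.
    const OLLAMA_BASE_URL = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";
    // Requests to Ollama are then built relative to this base, e.g.:
    const tagsUrl = new URL("/api/tags", OLLAMA_BASE_URL);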

Docker

To run with Docker, create a Dockerfile like the following and build the image:

    FROM node:18-alpine
    WORKDIR /app
    COPY package*.json ./
    RUN npm ci --only=production
    COPY dist/ ./dist/
    CMD ["node", "dist/index.js"]
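
With the Dockerfile in place, something like docker build -t mcp-consult . followed by docker run -i mcp-consult will build and run the server (the image name is arbitrary, and -i keeps stdin open, which matters if the server talks MCP over stdio). Note that the Dockerfile copies dist/, so run npm run build before building the image.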

Requirements

  • Node.js 18+

  • Ollama running locally or accessible via HTTP

