
MCP Ollama Consult Server

by Atomic-Germ


An MCP (Model Context Protocol) server that allows consulting with Ollama models for reasoning from alternative viewpoints.

Features

  • consult_ollama: Send prompts to Ollama models and get responses

  • list_ollama_models: List available models on the local Ollama instance
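
Both tools can be called from any MCP client. The sketch below connects to the server over stdio using the official TypeScript SDK; the argument names ("model", "prompt") are assumptions for illustration, since the tool schemas are not documented here.

// Sketch only: assumes @modelcontextprotocol/sdk is installed and the
// server has been built to dist/index.js (see Installation below).
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the tools the server exposes (consult_ollama, list_ollama_models).
console.log(await client.listTools());

// Ask a local model for an alternative viewpoint; argument names are assumed.
const result = await client.callTool({
  name: "consult_ollama",
  arguments: { model: "llama3", prompt: "Argue the opposite of my plan." },
});
console.log(result.content);

await client.close();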

Installation

  1. Ensure you have Node.js 18 or later installed

  2. Install dependencies:

    npm install
  3. Build the project:

    npm run build

Usage

Make sure Ollama is running locally (default: http://localhost:11434).
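
To confirm that Ollama is reachable, you can hit its model-listing endpoint (part of Ollama's standard HTTP API):

curl http://localhost:11434/api/tags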

Start the MCP server:

npm start

Or for development:

npm run dev

Configuration

Set the OLLAMA_BASE_URL environment variable to change the Ollama endpoint:

OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
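
Inside the server, this presumably falls back to the local default when the variable is unset; a minimal sketch of that pattern (not the actual source) looks like:

// Sketch: resolve the Ollama endpoint, defaulting to the local instance.
const baseUrl = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";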

Docker

To run with Docker, build an image from a Dockerfile like the following:

FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY dist/ ./dist/
CMD ["node", "dist/index.js"]
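
Then build and run the image. The name mcp-consult is just an example, and the run command assumes the server talks MCP over stdio (hence -i) and that Ollama runs on the Docker host (host.docker.internal resolves to the host on Docker Desktop; adjust for your setup):

docker build -t mcp-consult .
docker run -i -e OLLAMA_BASE_URL=http://host.docker.internal:11434 mcp-consult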

Requirements

  • Node.js 18+

  • Ollama running locally or accessible via HTTP

Security: A (no known vulnerabilities)
License: F (not found)
Quality: A (confirmed to work)

hybrid server

The server is able to function both locally and remotely, depending on the configuration or use case.

Enables consulting with local Ollama models for reasoning from alternative viewpoints. Supports sending prompts to Ollama models and listing available models on your local Ollama instance.


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Atomic-Germ/mcp-consult'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.