# MCP Ollama Consult Server

[![TypeScript](https://img.shields.io/badge/TypeScript-5.0+-blue.svg)](https://www.typescriptlang.org/) [![MCP](https://img.shields.io/badge/MCP-Compatible-green.svg)](https://modelcontextprotocol.io) [![CI/CD](https://github.com/Atomic-Germ/mcp-consult/actions/workflows/ci.yml/badge.svg)](https://github.com/Atomic-Germ/mcp-consult/actions/workflows/ci.yml)

An MCP (Model Context Protocol) server that allows consulting with Ollama models for reasoning from alternative viewpoints.

<a href="https://glama.ai/mcp/servers/@Atomic-Germ/mcp-consult">
  <img width="380" height="200" src="https://glama.ai/mcp/servers/@Atomic-Germ/mcp-consult/badge" alt="Ollama Consult Server MCP server" />
</a>

## Features

- **consult_ollama**: Send prompts to Ollama models and get responses
- **list_ollama_models**: List available models on the local Ollama instance

## Installation

1. Ensure you have Node.js installed.
2. Install dependencies:

   ```bash
   npm install
   ```

3. Build the project:

   ```bash
   npm run build
   ```

## Usage

Make sure Ollama is running locally (default: [http://localhost:11434](http://localhost:11434)).

Start the MCP server:

```bash
npm start
```

Or for development:

```bash
npm run dev
```

## Configuration

Set the `OLLAMA_BASE_URL` environment variable to change the Ollama endpoint:

```bash
OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
```

## Docker

To run with Docker, build an image from a Dockerfile like the following (build the project first so `dist/` exists):

```dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY dist/ ./dist/
CMD ["node", "dist/index.js"]
```

## Requirements

- Node.js 18+
- Ollama running locally or accessible via HTTP
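For orientation, here is a minimal TypeScript sketch of the kind of requests the two tools map onto. It assumes Ollama's standard HTTP API (`GET /api/tags` to list models, `POST /api/generate` for completions) and Node 18's built-in `fetch`; the function names are illustrative and are not taken from this project's source.

```typescript
// Illustrative sketch, not the project's actual implementation.
// Only OLLAMA_BASE_URL and the standard Ollama endpoints are assumed here.

const OLLAMA_BASE_URL = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";

// Roughly what list_ollama_models needs: GET /api/tags returns { models: [...] }.
async function listModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_BASE_URL}/api/tags`);
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}

// Roughly what consult_ollama needs: POST /api/generate with stream disabled
// returns the full completion in the `response` field.
async function consult(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_BASE_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}

// Example usage:
// const models = await listModels();
// console.log(await consult(models[0], "Argue the opposite position on this plan."));
```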
