# Ollama MCP Server

A Model Context Protocol (MCP) server that provides tools for interacting with Ollama models. This server enables AI assistants to list, chat with, generate responses from, and manage Ollama models through a standardized protocol.

## 🚀 Features

- **Model Management**: List, pull, and delete Ollama models
- **Chat Interface**: Multi-turn conversations with models
- **Text Generation**: Single-prompt text generation
- **Dual Transport**: Stdio (local) and HTTP (remote) support
- **Railway Ready**: Pre-configured for Railway deployment
- **Type Safe**: Full TypeScript implementation with strict typing

## 📋 Prerequisites

- Node.js 18+
- Ollama installed and running locally
- For Railway deployment: Railway CLI

๐Ÿ› ๏ธ Installation

### Local Development

1. Clone and install dependencies:

   ```bash
   git clone <repository-url>
   cd ollama-mcp
   npm install
   ```

2. Build the project:

   ```bash
   npm run build
   ```

3. Start the server:

   ```bash
   npm start
   ```

### Using with Cursor

Add this to your Cursor MCP configuration (`~/.cursor/mcp/config.json`):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp/dist/main.js"]
    }
  }
}
```

Quick setup:

```bash
curl -sSL https://raw.githubusercontent.com/your-repo/ollama-mcp/main/config/mcp.config.json -o ~/.cursor/mcp/config.json
```

๐Ÿ—๏ธ Architecture

The project is structured for maximum readability and maintainability:

```
src/
├── main.ts            # Main entry point
├── config/            # Configuration management
├── server/            # Core MCP server
├── tools/             # MCP tool implementations
├── transports/        # Communication transports
└── ollama-client.ts   # Ollama API client
docs/                  # Comprehensive documentation
config/                # Configuration files
scripts/               # Deployment scripts
```

See ARCHITECTURE.md for detailed architecture documentation.

## 🔧 Configuration

### Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| `MCP_TRANSPORT` | Transport type (`stdio` or `http`) | `stdio` |
| `OLLAMA_BASE_URL` | Ollama API base URL | `http://localhost:11434` |
| `MCP_HTTP_HOST` | HTTP server host (HTTP mode) | `0.0.0.0` |
| `MCP_HTTP_PORT` | HTTP server port (HTTP mode) | `8080` |
| `MCP_HTTP_ALLOWED_ORIGINS` | CORS allowed origins (HTTP mode) | None |
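
For orientation, here is a minimal sketch of how these variables might be consumed at startup. The names and defaults mirror the table above; the project's actual loader lives in `src/config/` and may expose a different shape:

```typescript
// Sketch only: resolve runtime configuration from the environment.
// Variable names and defaults follow the table above.
export interface ServerConfig {
  transport: "stdio" | "http";
  ollamaBaseUrl: string;
  httpHost: string;
  httpPort: number;
  allowedOrigins: string[];
}

export function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  return {
    transport: env.MCP_TRANSPORT === "http" ? "http" : "stdio",
    ollamaBaseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434",
    httpHost: env.MCP_HTTP_HOST ?? "0.0.0.0",
    httpPort: Number(env.MCP_HTTP_PORT ?? 8080),
    // Assumed to be a comma-separated list; no origins allowed by default.
    allowedOrigins:
      env.MCP_HTTP_ALLOWED_ORIGINS?.split(",").map((s) => s.trim()) ?? [],
  };
}
```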

### Transport Modes

#### Stdio Transport (Default)

Perfect for local development and direct integration:

```bash
npm start
```

#### HTTP Transport

Ideal for remote deployment and web-based clients:

```bash
MCP_TRANSPORT=http npm start
```
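
Once the HTTP server is listening, you can smoke-test it with a raw JSON-RPC request. The sketch below assumes the server speaks the standard MCP Streamable HTTP transport on the `/mcp` path (see the Deployment section); depending on session handling, an `initialize` handshake may be required before other methods succeed:

```typescript
// Sketch: ask the server for its tool list over HTTP.
const response = await fetch("http://localhost:8080/mcp", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Streamable HTTP servers may reply with plain JSON or an SSE stream.
    Accept: "application/json, text/event-stream",
  },
  body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "tools/list", params: {} }),
});
console.log(response.status, await response.text());
```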

## 🚀 Deployment

### Railway Deployment

1. Install the Railway CLI and log in:

   ```bash
   npm install -g @railway/cli
   railway login
   ```

2. Deploy:

   ```bash
   railway up
   ```

3. Add models (optional):

   ```bash
   railway shell
   # Follow the instructions in docs/RAILWAY_MODELS_SETUP.md
   ```

The Railway deployment automatically uses HTTP transport and exposes:

- **MCP Endpoint**: `https://your-app.railway.app/mcp`
- **Health Check**: `https://your-app.railway.app/healthz`
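
For a quick liveness probe from code rather than curl, something like the following works (`your-app` is a placeholder for your deployment's hostname):

```typescript
// Sketch: verify the deployment is healthy before wiring up a client.
const res = await fetch("https://your-app.railway.app/healthz");
if (!res.ok) throw new Error(`Health check failed: ${res.status}`);
console.log("MCP server is up");
```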

### Docker Deployment

```bash
# Build the image
npm run docker:build

# Run locally
npm run docker:run

# Deploy to Railway
railway up
```

## 📚 Available Tools

The server provides five MCP tools for interacting with Ollama:

1. `ollama_list_models` - List available models
2. `ollama_chat` - Multi-turn conversations
3. `ollama_generate` - Single-prompt generation
4. `ollama_pull_model` - Download models
5. `ollama_delete_model` - Remove models

See API.md for detailed API documentation.
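
As an illustration, the sketch below connects to the server over stdio using the official TypeScript MCP SDK (`@modelcontextprotocol/sdk`) and invokes two of the tools. The tool names come from the list above, but the argument shapes (`model`, `messages`) are assumptions here; see API.md for the actual schemas:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server as a child process and talk to it over stdio.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/ollama-mcp/dist/main.js"],
});
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// List the models Ollama has installed.
const models = await client.callTool({ name: "ollama_list_models", arguments: {} });
console.log(JSON.stringify(models, null, 2));

// Hold a short chat (argument shape assumed; check API.md).
const reply = await client.callTool({
  name: "ollama_chat",
  arguments: {
    model: "llama2",
    messages: [{ role: "user", content: "Hello, how are you?" }],
  },
});
console.log(JSON.stringify(reply, null, 2));

await client.close();
```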

## 🧪 Testing

### Local Testing

```bash
# Test stdio transport
npm start

# Test HTTP transport
MCP_TRANSPORT=http npm start

# Test the health check (HTTP mode)
curl http://localhost:8080/healthz
```

### Model Testing

```bash
# List available models
ollama list

# Test a model
ollama run llama2 "Hello, how are you?"
```

## 📖 Documentation

๐Ÿค Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request

## 📄 License

MIT License - see LICENSE for details.

## 🆘 Troubleshooting

### Common Issues

"Cannot find module" errors:

npm install npm run build

**Ollama connection issues:**

```bash
# Check whether Ollama is running
ollama list

# Start the Ollama service
ollama serve
```

**Railway deployment issues:**

```bash
# Check Railway logs
railway logs

# Verify environment variables
railway variables
```

### Getting Help


Built with ❤️ for the AI community
