Ollama MCP Server

A Model Context Protocol (MCP) server that provides tools for interacting with Ollama models. This server enables AI assistants to list, chat with, generate responses from, and manage Ollama models through a standardized protocol.

πŸš€ Features

  • Model Management: List, pull, and delete Ollama models

  • Chat Interface: Multi-turn conversations with models

  • Text Generation: Single-prompt text generation

  • Dual Transport: Stdio (local) and HTTP (remote) support

  • Railway Ready: Pre-configured for Railway deployment

  • Type Safe: Full TypeScript implementation with strict typing

πŸ“‹ Prerequisites

  • Node.js 18+

  • Ollama installed and running locally

  • For Railway deployment: Railway CLI

πŸ› οΈ Installation

Local Development

  1. Clone and install dependencies:

    git clone <repository-url>
    cd ollama-mcp
    npm install
  2. Build the project:

    npm run build
  3. Start the server:

    npm start

Using with Cursor

Add this to your Cursor MCP configuration (~/.cursor/mcp/config.json):

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp/dist/main.js"]
    }
  }
}

Quick setup (note: this overwrites any existing Cursor MCP config):

curl -sSL https://raw.githubusercontent.com/your-repo/ollama-mcp/main/config/mcp.config.json -o ~/.cursor/mcp/config.json

πŸ—οΈ Architecture

The project is structured for readability and maintainability:

src/
β”œβ”€β”€ main.ts                 # Main entry point
β”œβ”€β”€ config/                 # Configuration management
β”œβ”€β”€ server/                 # Core MCP server
β”œβ”€β”€ tools/                  # MCP tool implementations
β”œβ”€β”€ transports/             # Communication transports
└── ollama-client.ts        # Ollama API client

docs/                       # Comprehensive documentation
config/                     # Configuration files
scripts/                    # Deployment scripts

See ARCHITECTURE.md for detailed architecture documentation.
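As a hypothetical sketch (method and field names here are illustrative, not the project's actual implementation), the ollama-client.ts module might wrap the Ollama REST API like this:

```typescript
// Illustrative sketch of an Ollama API client, assuming Node.js 18+
// (built-in fetch). Endpoints /api/tags and /api/generate are the
// documented Ollama REST API; everything else is an assumption.
export class OllamaClient {
  constructor(public readonly baseUrl: string = "http://localhost:11434") {}

  // GET /api/tags returns the locally installed models.
  async listModels(): Promise<string[]> {
    const res = await fetch(`${this.baseUrl}/api/tags`);
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const body = (await res.json()) as { models: { name: string }[] };
    return body.models.map((m) => m.name);
  }

  // POST /api/generate with stream: false for a single completion.
  async generate(model: string, prompt: string): Promise<string> {
    const res = await fetch(`${this.baseUrl}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const body = (await res.json()) as { response: string };
    return body.response;
  }
}
```

The real module also backs the pull and delete tools; see the source for details.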

πŸ”§ Configuration

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| MCP_TRANSPORT | Transport type (stdio or http) | stdio |
| OLLAMA_BASE_URL | Ollama API base URL | http://localhost:11434 |
| MCP_HTTP_HOST | HTTP server host (HTTP mode) | 0.0.0.0 |
| MCP_HTTP_PORT | HTTP server port (HTTP mode) | 8080 |
| MCP_HTTP_ALLOWED_ORIGINS | CORS allowed origins (HTTP mode) | None |
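For example, to run the server over HTTP on a custom port against a remote Ollama instance (the host names and port here are illustrative):

```shell
# Illustrative values; adjust to your environment.
export MCP_TRANSPORT=http
export MCP_HTTP_HOST=0.0.0.0
export MCP_HTTP_PORT=9090
export OLLAMA_BASE_URL=http://ollama.internal:11434
export MCP_HTTP_ALLOWED_ORIGINS=https://app.example.com
npm start
```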

Transport Modes

Stdio Transport (Default)

Perfect for local development and direct integration:

npm start

HTTP Transport

Ideal for remote deployment and web-based clients:

MCP_TRANSPORT=http npm start

πŸš€ Deployment

Railway Deployment

  1. Install Railway CLI:

    npm install -g @railway/cli
    railway login
  2. Deploy:

    railway up
  3. Add models (optional):

    railway shell
    # Follow instructions in docs/RAILWAY_MODELS_SETUP.md

The Railway deployment automatically uses HTTP transport and exposes:

  • MCP Endpoint: https://your-app.railway.app/mcp

  • Health Check: https://your-app.railway.app/healthz
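Assuming the HTTP transport follows the standard MCP JSON-RPC convention, you can sanity-check the MCP endpoint with an initialize request (URL and headers are illustrative):

```shell
curl -X POST https://your-app.railway.app/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2024-11-05",
      "capabilities": {},
      "clientInfo": { "name": "curl-test", "version": "0.0.0" }
    }
  }'
```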

Docker Deployment

# Build the image
npm run docker:build

# Run locally
npm run docker:run

# Deploy to Railway
railway up

πŸ“š Available Tools

The server provides 5 MCP tools for Ollama interaction:

  1. ollama_list_models - List available models

  2. ollama_chat - Multi-turn conversations

  3. ollama_generate - Single-prompt generation

  4. ollama_pull_model - Download models

  5. ollama_delete_model - Remove models

See API.md for detailed API documentation.
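As a sketch of how an MCP client might invoke one of these tools (the argument names are assumptions; see API.md for the actual schema), a tools/call request could look like:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "ollama_chat",
    "arguments": {
      "model": "llama2",
      "messages": [
        { "role": "user", "content": "Hello, how are you?" }
      ]
    }
  }
}
```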

πŸ§ͺ Testing

Local Testing

# Test stdio transport
npm start

# Test HTTP transport
MCP_TRANSPORT=http npm start

# Test health check (HTTP mode)
curl http://localhost:8080/healthz

Model Testing

# List available models
ollama list

# Test a model
ollama run llama2 "Hello, how are you?"

πŸ“– Documentation

🀝 Contributing

  1. Fork the repository

  2. Create a feature branch

  3. Make your changes

  4. Add tests if applicable

  5. Submit a pull request

πŸ“„ License

MIT License - see LICENSE for details.

πŸ†˜ Troubleshooting

Common Issues

"Cannot find module" errors:

npm install
npm run build

Ollama connection issues:

# Check if Ollama is running
ollama list

# Check Ollama service
ollama serve

Railway deployment issues:

# Check Railway logs
railway logs

# Verify environment variables
railway variables

Getting Help

If you run into problems not covered above, open an issue on the repository.


Built with ❀️ for the AI community
