# OpenClaw MCP Server

Model Context Protocol (MCP) server for OpenClaw AI assistant integration.

The openclaw-mcp server bridges AI clients (such as Claude.ai) with the OpenClaw AI assistant gateway, enabling both synchronous and asynchronous interactions:

- `openclaw_chat` – Send a message to OpenClaw and receive an immediate response; supports an optional session ID for conversation continuity.
- `openclaw_status` – Check the health and operational status of the OpenClaw gateway.
- `openclaw_chat_async` – Queue a message for asynchronous processing and receive a `task_id` immediately; supports priority levels and session IDs.
- `openclaw_task_status` – Poll the progress of a queued async task and retrieve its result once complete.
- `openclaw_task_list` – List all async tasks, with optional filtering by status (`pending`, `running`, `completed`, `failed`, `cancelled`) or session ID.
- `openclaw_task_cancel` – Cancel a pending task that hasn't started processing yet.
## Demo

## Why I Built This
Hey! I created this MCP server because I didn't want to rely solely on messaging channels to communicate with OpenClaw. What really excites me is the ability to connect OpenClaw to the Claude web UI. Essentially, my chat can delegate tasks to my Claw bot, which then handles everything else – like spinning up Claude Code to fix issues for me.
Think of it as an AI assistant orchestrating another AI assistant. Pretty cool, right?
## Quick Start

### Docker (Recommended)
Pre-built images are published to GitHub Container Registry on every release.
```bash
docker pull ghcr.io/freema/openclaw-mcp:latest
```

Create a `docker-compose.yml`:

```yaml
services:
  mcp-bridge:
    image: ghcr.io/freema/openclaw-mcp:latest
    container_name: openclaw-mcp
    restart: unless-stopped
    ports:
      - "3000:3000"
    environment:
      - OPENCLAW_URL=http://host.docker.internal:18789
      - OPENCLAW_GATEWAY_TOKEN=${OPENCLAW_GATEWAY_TOKEN}
      - AUTH_ENABLED=true
      - MCP_CLIENT_ID=openclaw
      - MCP_CLIENT_SECRET=${MCP_CLIENT_SECRET}
      - MCP_ISSUER_URL=${MCP_ISSUER_URL:-}
      - CORS_ORIGINS=https://claude.ai
    extra_hosts:
      - "host.docker.internal:host-gateway"
    read_only: true
    security_opt:
      - no-new-privileges
```

Generate secrets and start:

```bash
export MCP_CLIENT_SECRET=$(openssl rand -hex 32)
export OPENCLAW_GATEWAY_TOKEN=your-gateway-token
docker compose up -d
```

Then, in Claude.ai, add a custom MCP connector pointing to your server, using `MCP_CLIENT_ID=openclaw` and your `MCP_CLIENT_SECRET`.
> **Tip:** Pin a specific version instead of `latest` for production: `ghcr.io/freema/openclaw-mcp:1.1.0`
### Local (Claude Desktop)

```bash
npx openclaw-mcp
```

Add to your Claude Desktop config:

```json
{
  "mcpServers": {
    "openclaw": {
      "command": "npx",
      "args": ["openclaw-mcp"],
      "env": {
        "OPENCLAW_URL": "http://127.0.0.1:18789",
        "OPENCLAW_GATEWAY_TOKEN": "your-gateway-token",
        "OPENCLAW_TIMEOUT_MS": "300000"
      }
    }
  }
}
```

### Remote (Claude.ai) without Docker
```bash
AUTH_ENABLED=true MCP_CLIENT_ID=openclaw MCP_CLIENT_SECRET=your-secret \
MCP_ISSUER_URL=https://mcp.your-domain.com \
CORS_ORIGINS=https://claude.ai OPENCLAW_GATEWAY_TOKEN=your-gateway-token \
npx openclaw-mcp --transport sse --port 3000
```

> **Important:** When running behind a reverse proxy (Caddy, nginx, etc.), you must set `MCP_ISSUER_URL` (or `--issuer-url`) to your public HTTPS URL. Without it, the OAuth metadata will advertise `http://localhost:3000` and clients will fail to authenticate.
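To illustrate why that matters, here is a minimal sketch (a hypothetical helper, not the bridge's actual source) of how a server typically derives the issuer URL it advertises in its OAuth metadata:

```typescript
// Hypothetical illustration of issuer-URL resolution in the spirit of
// MCP_ISSUER_URL; not the actual openclaw-mcp implementation.
function resolveIssuerUrl(
  env: Record<string, string | undefined>,
  port: number
): string {
  // Prefer the explicitly configured public URL (trailing slash stripped).
  const configured = env.MCP_ISSUER_URL?.trim();
  if (configured) return configured.replace(/\/+$/, "");
  // Fallback: the local bind address, which remote clients such as
  // Claude.ai cannot reach when the server sits behind a reverse proxy.
  return `http://localhost:${port}`;
}

console.log(resolveIssuerUrl({ MCP_ISSUER_URL: "https://mcp.your-domain.com/" }, 3000));
// → https://mcp.your-domain.com
console.log(resolveIssuerUrl({}, 3000));
// → http://localhost:3000
```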
See Installation Guide for details.
## Architecture
```
┌───────────────────────────────────────────────────────────┐
│                        Your Server                        │
│                                                           │
│  ┌─────────────────┐       ┌─────────────────────────┐    │
│  │    OpenClaw     │       │      OpenClaw MCP       │    │
│  │    Gateway      │◄──────│      Bridge Server      │    │
│  │    :18789       │       │      :3000              │    │
│  │                 │       │                         │    │
│  │  OpenAI-compat  │       │  - OAuth 2.1 auth       │    │
│  │  /v1/chat/...   │       │  - CORS protection      │    │
│  └─────────────────┘       │  - Input validation     │    │
│                            └────────────┬────────────┘    │
│                                         │                 │
└─────────────────────────────────────────┼─────────────────┘
                                          │ HTTPS + OAuth 2.1
                                          ▼
                                ┌─────────────────┐
                                │    Claude.ai    │
                                │   (MCP Client)  │
                                └─────────────────┘
```

## Available Tools
### Sync Tools

| Tool | Description |
|------|-------------|
| `openclaw_chat` | Send messages to OpenClaw and get responses |
| `openclaw_status` | Check OpenClaw gateway health |
| `openclaw_instances` | List all configured OpenClaw instances |

### Async Tools (for long-running operations)

| Tool | Description |
|------|-------------|
| `openclaw_chat_async` | Queue a message, get a `task_id` immediately |
| `openclaw_task_status` | Check task progress and get results |
| `openclaw_task_list` | List all tasks with filtering |
| `openclaw_task_cancel` | Cancel a pending task |
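The async tools follow a queue-then-poll pattern. The sketch below models that lifecycle with an in-memory stand-in for the gateway; all names here are illustrative, not the bridge's internals:

```typescript
// Illustrative in-memory model of the async task lifecycle
// (pending -> running -> completed); not the actual bridge code.
type TaskStatus = "pending" | "running" | "completed" | "failed" | "cancelled";

interface Task {
  id: string;
  message: string;
  status: TaskStatus;
  result?: string;
}

class TaskQueue {
  private tasks = new Map<string, Task>();
  private nextId = 1;

  // openclaw_chat_async: enqueue and return a task_id immediately.
  enqueue(message: string): string {
    const id = `task-${this.nextId++}`;
    this.tasks.set(id, { id, message, status: "pending" });
    return id;
  }

  // openclaw_task_status: poll progress / fetch the result.
  status(id: string): Task | undefined {
    return this.tasks.get(id);
  }

  // openclaw_task_cancel: only pending tasks can be cancelled.
  cancel(id: string): boolean {
    const task = this.tasks.get(id);
    if (!task || task.status !== "pending") return false;
    task.status = "cancelled";
    return true;
  }

  // Simulate a worker picking up and finishing a task (synchronously,
  // for the sake of the demo).
  run(id: string, result: string): void {
    const task = this.tasks.get(id);
    if (!task || task.status !== "pending") return;
    task.status = "running";
    task.result = result;
    task.status = "completed";
  }

  // openclaw_task_list: optional filtering by status.
  list(status?: TaskStatus): Task[] {
    const all = Array.from(this.tasks.values());
    return status ? all.filter((t) => t.status === status) : all;
  }
}

const q = new TaskQueue();
const id = q.enqueue("Run the nightly build");
q.run(id, "build passed");
console.log(q.status(id)?.status); // → completed
```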
## Multi-Instance Mode

Orchestrate multiple OpenClaw gateways from a single MCP server. One bridge, many claws – route requests to prod, staging, dev, or whatever you name them (`lobster-supreme` and `the-claw-abides` are perfectly valid names).
```
┌────────────────────────────────────────────────────────────────┐
│                   Claude.ai / Claude Desktop                   │
│                         (MCP Client)                           │
└───────────────────────────────┬────────────────────────────────┘
                                │
                                ▼
┌────────────────────────────────────────────────────────────────┐
│                   OpenClaw MCP Bridge Server                   │
│                                                                │
│   ┌────────────┐     ┌────────────┐     ┌────────────┐         │
│   │  Instance  │     │  Instance  │     │  Instance  │         │
│   │  Registry  │     │  Resolver  │     │  Validator │         │
│   └─────┬──────┘     └─────┬──────┘     └─────┬──────┘         │
│         │                  │                  │                │
│   ┌─────┴──────────────────┴──────────────────┴──────┐         │
│   │          Per-Instance OpenClaw Clients           │         │
│   │    (separate auth, timeout, URL per instance)    │         │
│   └────────┬───────────────┬───────────────┬─────────┘         │
└────────────┼───────────────┼───────────────┼───────────────────┘
             │               │               │
             ▼               ▼               ▼
     ┌──────────────┐ ┌──────────────┐ ┌──────────────┐
     │     prod     │ │   staging    │ │     dev      │
     │  (default)   │ │              │ │              │
     │    :18789    │ │    :18789    │ │    :18789    │
     │  OpenClaw GW │ │  OpenClaw GW │ │  OpenClaw GW │
     └──────────────┘ └──────────────┘ └──────────────┘
```

### Setup
```bash
OPENCLAW_INSTANCES='[
  {"name": "prod", "url": "http://prod:18789", "token": "tok1", "default": true},
  {"name": "staging", "url": "http://staging:18789", "token": "tok2"},
  {"name": "dev", "url": "http://dev:18789", "token": "tok3"}
]'
```

### Usage
All tools accept an optional `instance` parameter to target a specific gateway:

```bash
# Chat with the staging instance
openclaw_chat message="Deploy status?" instance="staging"

# Check health of prod
openclaw_status instance="prod"

# List all configured instances
openclaw_instances

# Async task targeting dev
openclaw_chat_async message="Run tests" instance="dev"
```

When `instance` is omitted, the default instance is used. Each instance has its own auth token, timeout, and URL – fully isolated.
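Conceptually, resolving the `instance` parameter works like this. The sketch assumes the `OPENCLAW_INSTANCES` JSON shape shown above; the bridge's actual registry and resolver may differ:

```typescript
// Sketch of default/named instance resolution over OPENCLAW_INSTANCES.
// Illustrative only; not the bridge's real resolver.
interface Instance {
  name: string;
  url: string;
  token: string;
  default?: boolean;
}

function resolveInstance(instancesJson: string, name?: string): Instance {
  const instances: Instance[] = JSON.parse(instancesJson);
  if (name) {
    const match = instances.find((i) => i.name === name);
    if (!match) throw new Error(`Unknown instance: ${name}`);
    return match;
  }
  // No instance given: fall back to the one flagged "default",
  // or the first entry if none is flagged.
  const fallback = instances.find((i) => i.default) ?? instances[0];
  if (!fallback) throw new Error("No instances configured");
  return fallback;
}

const cfg = JSON.stringify([
  { name: "prod", url: "http://prod:18789", token: "tok1", default: true },
  { name: "staging", url: "http://staging:18789", token: "tok2" },
]);

console.log(resolveInstance(cfg).name);           // → prod
console.log(resolveInstance(cfg, "staging").url); // → http://staging:18789
```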
## Key Features

- **Zero-migration upgrade** – existing single-instance deployments work without any config change
- **Per-instance isolation** – separate auth tokens, timeouts, and URLs
- **Dynamic routing** – Claude picks the right instance per request
- **Task tracking** – async tasks remember which instance they target
- **Security** – tokens are never exposed via `openclaw_instances`

See Configuration → Multi-Instance Mode for the full reference.
## Documentation

- **Installation** – Setup for Claude Desktop & Claude.ai
- **Configuration** – Environment variables & options
- **Deployment** – Docker & production setup
- **Threat Model** – What Claude can and can't trigger, trust boundaries & attack surfaces
- **Logging** – What gets logged, where, and what is never logged
- **Development** – Contributing & adding tools
- **Security** – Security policy & best practices
## Security

> ⚠️ **Always enable authentication in production!**
```bash
# Generate a secure client secret
export MCP_CLIENT_SECRET=$(openssl rand -hex 32)

# Run with auth enabled
AUTH_ENABLED=true MCP_CLIENT_ID=openclaw MCP_CLIENT_SECRET=$MCP_CLIENT_SECRET \
openclaw-mcp --transport sse
```

Configure CORS to restrict access:

```bash
CORS_ORIGINS=https://claude.ai,https://your-app.com
```

See Configuration for all security options.
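In effect, the `CORS_ORIGINS` allow-list is an exact-match check against the request's `Origin` header, roughly like this hypothetical sketch (not the bridge's actual middleware):

```typescript
// Hypothetical sketch of an exact-match Origin allow-list check,
// in the spirit of CORS_ORIGINS; not openclaw-mcp's real middleware.
function isOriginAllowed(
  corsOrigins: string,
  origin: string | undefined
): boolean {
  // Requests without an Origin header are not cross-origin browser
  // requests; for this simplified sketch we just reject them.
  if (!origin) return false;
  const allowed = corsOrigins.split(",").map((o) => o.trim());
  return allowed.includes(origin);
}

console.log(isOriginAllowed("https://claude.ai,https://your-app.com", "https://claude.ai"));
// → true
console.log(isOriginAllowed("https://claude.ai,https://your-app.com", "https://evil.example"));
// → false
```

Note the exact match: `https://claude.ai.evil.example` would not pass, which is why substring or prefix matching is the wrong tool here.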
## Requirements

- Node.js ≥ 20
- OpenClaw gateway running with the HTTP API enabled:

```jsonc
// openclaw.json
{
  "gateway": {
    "http": {
      "endpoints": {
        "chatCompletions": { "enabled": true }
      }
    }
  }
}
```
## License
MIT
## Author

Created by Tomáš Grasl
## Related Projects

- **OpenClaw** – The AI assistant this MCP server connects to
- **MCP Specification** – Model Context Protocol docs