Excalidraw MCP Server

by jeel00dev

Generate beautiful Excalidraw diagrams from natural language — entirely locally, no cloud API needed.

You describe what you want ("draw a microservices architecture for an e-commerce app"), and the MCP server calls your local llama.cpp LLM to produce a valid .excalidraw file you can open instantly.


How It Works

You (Claude Desktop / Cursor)
        ↓  natural language description
  MCP Server (this project)
        ↓  structured prompt + Excalidraw JSON spec
  llama.cpp  (localhost:8080)
        ↓  raw Excalidraw JSON
  MCP Server  →  validates + saves  →  ~/excalidraw_diagrams/my-diagram.excalidraw
        ↓
  Open in Excalidraw
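
The file the server writes follows the standard Excalidraw JSON file format. The exact fields this project emits aren't shown here, but a minimal valid .excalidraw file looks roughly like this (element values are illustrative):

```json
{
  "type": "excalidraw",
  "version": 2,
  "source": "excalidraw-mcp",
  "elements": [
    {
      "id": "box-1",
      "type": "rectangle",
      "x": 100,
      "y": 100,
      "width": 200,
      "height": 80,
      "strokeColor": "#1e1e1e",
      "backgroundColor": "transparent"
    }
  ],
  "appState": { "viewBackgroundColor": "#ffffff" },
  "files": {}
}
```

The top-level "type", "version", and "elements" keys are what Excalidraw checks when you open a file, which is why the server validates the LLM's raw output before saving it.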

Prerequisites

| Requirement | Version | Notes |
|---|---|---|
| Python | ≥ 3.11 | python3 --version |
| uv | latest | pip install uv (recommended) |
| llama.cpp | latest | see Step 1 |
| A GGUF model | 7B+ recommended | see Step 2 |
| Excalidraw | web or local | see Step 5 |


Setup

Step 1 — Build llama.cpp

git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build -j$(nproc)

On macOS with Apple Silicon, recent llama.cpp builds enable Metal GPU acceleration by default; older builds required -DLLAMA_METAL=ON.

Step 2 — Download a GGUF model

Recommended models (best JSON output quality):

| Model | Size | HuggingFace path |
|---|---|---|
| Qwen2.5-7B-Instruct (recommended) | ~4.5 GB | Qwen/Qwen2.5-7B-Instruct-GGUF |
| Llama-3.1-8B-Instruct | ~4.7 GB | meta-llama/Meta-Llama-3.1-8B-Instruct-GGUF |
| Mistral-7B-Instruct-v0.3 | ~4.1 GB | mistralai/Mistral-7B-Instruct-v0.3-GGUF |

# Inside the llama.cpp directory:
mkdir models
# Download with huggingface-cli (pip install huggingface_hub):
huggingface-cli download Qwen/Qwen2.5-7B-Instruct-GGUF \
    qwen2.5-7b-instruct-q4_k_m.gguf \
    --local-dir models/

Step 3 — Start the llama.cpp server

# From inside the llama.cpp directory:
./build/bin/llama-server \
    -m models/qwen2.5-7b-instruct-q4_k_m.gguf \
    --port 8080 \
    -c 8192 \
    --host 0.0.0.0

Verify it's running:

curl http://localhost:8080/health
# → {"status":"ok"}
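
If you prefer to script the check (for example, before launching the MCP server), a small Python helper can probe the same endpoint. This is a sketch with the standard library only, not part of this project:

```python
import json
import urllib.error
import urllib.request


def llm_healthy(base_url: str = "http://localhost:8080", timeout: float = 2.0) -> bool:
    """Return True if the llama.cpp server answers /health with status "ok"."""
    try:
        with urllib.request.urlopen(base_url + "/health", timeout=timeout) as resp:
            return json.loads(resp.read()).get("status") == "ok"
    except (urllib.error.URLError, OSError, ValueError):
        # Server not running, unreachable, or returned non-JSON output.
        return False


if __name__ == "__main__":
    print("llama.cpp up:", llm_healthy())
```

The function returns False rather than raising, so it is safe to call in a startup loop that waits for the model to finish loading.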

Step 4 — Install the MCP server

# Clone this repo
git clone <repo-url>
cd exclalidraw_mcp

# Install with uv (recommended)
uv sync

# Or with pip
pip install -e .

Verify the CLI entry point works:

excalidraw-mcp --help

Step 5 — Configure your MCP client

Claude Desktop (Linux)

Edit ~/.config/claude/claude_desktop_config.json:

{
  "mcpServers": {
    "excalidraw": {
      "command": "excalidraw-mcp"
    }
  }
}

If using uv, replace "command": "excalidraw-mcp" with:

"command": "uv",
"args": ["--directory", "/absolute/path/to/exclalidraw_mcp", "run", "excalidraw-mcp"]

Claude Desktop (macOS)

Edit ~/Library/Application Support/Claude/claude_desktop_config.json with the same content.

Cursor / VS Code

Add to your MCP settings with the same server config above.

Restart the app after editing the config.

Step 6 — Run Excalidraw locally (optional)

You can always use excalidraw.com for free. But to run it fully locally:

docker run -p 5000:80 excalidraw/excalidraw:latest
# Open http://localhost:5000

Or via Node:

npx excalidraw

Usage

Once the MCP server is connected, ask your AI client:

Generate a flowchart for a user login system with OAuth
Draw a microservices architecture for an e-commerce platform with cart, payment, and inventory services
Create a mind map about machine learning: supervised, unsupervised, reinforcement learning
Make a sequence diagram showing a REST API request from browser to server to database and back
Draw an ER diagram for a blog: users, posts, comments, tags

Available MCP Tools

| Tool | Description |
|---|---|
| generate_diagram(description, diagram_type, filename) | Main tool — generate a diagram from text |
| check_llm_status() | Verify llama.cpp is running |
| list_diagrams() | List all saved diagrams |

generate_diagram parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| description | string | required | What the diagram should show |
| diagram_type | string | "flowchart" | One of: flowchart, mindmap, sequence, architecture, erd, freeform |
| filename | string | "diagram" | Output filename (no extension needed) |
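
You normally never build these calls by hand — the client does it for you — but for reference, MCP tools are invoked over JSON-RPC with a tools/call request. A sketch of what a generate_diagram invocation looks like on the wire (argument values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "generate_diagram",
    "arguments": {
      "description": "User login flow with OAuth and session storage",
      "diagram_type": "flowchart",
      "filename": "login-flow"
    }
  }
}
```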

Opening a generated diagram

Diagrams are saved to ~/excalidraw_diagrams/.

  1. Open excalidraw.com or your local instance

  2. Click the folder icon (top left) → Open

  3. Select your .excalidraw file
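
For a quick sanity check from the command line before opening a file in the editor, a small script (a hypothetical helper, not shipped with this project) can confirm a generated file is well-formed Excalidraw JSON:

```python
import json
import sys
from pathlib import Path


def summarize_diagram(path):
    """Return (element count, sorted element types) for a .excalidraw file.

    Raises ValueError if the file is not Excalidraw JSON.
    """
    data = json.loads(Path(path).read_text())
    if data.get("type") != "excalidraw":
        raise ValueError(f"{path} is not an Excalidraw file")
    elements = data.get("elements", [])
    return len(elements), sorted({el.get("type", "?") for el in elements})


if __name__ == "__main__":
    count, types = summarize_diagram(sys.argv[1])
    print(f"{count} elements: {', '.join(types)}")
```

Running it against a generated file prints the element count and types, which is usually enough to tell an empty or truncated LLM response from a real diagram.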


Running Tests

# Install test dependencies
uv add --dev pytest pytest-anyio respx

# Run all tests
pytest tests/ -v

Troubleshooting

"llama.cpp server is not running"

Run curl http://localhost:8080/health. If it fails, start the server (Step 3).

"Could not parse LLM output as valid Excalidraw JSON"

The LLM returned malformed JSON. Try:

  • Use a better model (Qwen2.5-7B or larger)

  • Ensure llama.cpp started with -c 8192 (enough context)

  • Try a simpler description first to verify the pipeline works

"Diagram looks wrong / missing elements"

  • Be more specific in your description

  • Specify diagram_type explicitly (e.g., "flowchart" not "freeform")

  • Larger models (13B+) produce significantly better layout

Tool not appearing in Claude Desktop

  • Confirm claude_desktop_config.json has no JSON syntax errors

  • Restart Claude Desktop fully

  • Check logs: ~/.config/claude/logs/ (Linux) or ~/Library/Logs/Claude/ (macOS)


Project Structure

exclalidraw_mcp/
├── src/excalidraw_mcp/
│   ├── server.py       ← MCP server + tool definitions
│   ├── llm_client.py   ← llama.cpp HTTP client
│   ├── generator.py    ← Prompt building + JSON parsing + validation
│   └── schema.py       ← Excalidraw element dataclasses
├── prompts/
│   └── examples/       ← Few-shot example diagrams (flowchart, mindmap, sequence)
├── examples/
│   └── sample.excalidraw  ← Reference diagram you can open immediately
├── tests/
│   ├── test_generator.py
│   └── test_llm_client.py
├── pyproject.toml
└── README.md

Tips for Better Diagrams

  1. Be specific: "login flow with email/password, JWT token, and session storage" beats "login flow"

  2. Name your elements: "boxes labeled A, B, C connected by arrows" tells the model exactly which labels to use

  3. Specify colors: "use blue for services, yellow for databases"

  4. Keep it focused: One logical concept per diagram works better than trying to show everything

  5. Regenerate freely: If the first result isn't perfect, ask again with a different filename — it's instant


License

MIT
