Excalidraw MCP Server
Generate beautiful Excalidraw diagrams from natural language — entirely locally, no cloud API needed.
You describe what you want ("draw a microservices architecture for an e-commerce app"), and the MCP server calls your local llama.cpp LLM to produce a valid .excalidraw file you can open instantly.
How It Works
```
You (Claude Desktop / Cursor)
        ↓  natural language description
MCP Server (this project)
        ↓  structured prompt + Excalidraw JSON spec
llama.cpp (localhost:8080)
        ↓  raw Excalidraw JSON
MCP Server → validates + saves → ~/excalidraw_diagrams/my-diagram.excalidraw
        ↓
Open in Excalidraw
```

Prerequisites
| Requirement | Version | Notes |
|---|---|---|
| Python | ≥ 3.11 | |
| uv | latest | |
| llama.cpp | latest | see Step 1 |
| A GGUF model | 7B+ recommended | see Step 2 |
| Excalidraw | web or local | see Step 5 |
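For reference, the file the server produces follows Excalidraw's JSON format. Here is a minimal sketch of that shape (the element's id, coordinates, and colors are made-up illustration values; real files carry additional per-element fields, which Excalidraw tolerates being absent on import):

```python
import json

# Minimal .excalidraw document -- illustrative sketch of the format's
# top-level shape; id/coordinates below are made up for the example.
doc = {
    "type": "excalidraw",
    "version": 2,
    "source": "excalidraw-mcp",
    "elements": [
        {
            "id": "rect-1",
            "type": "rectangle",
            "x": 100, "y": 100,
            "width": 200, "height": 80,
            "strokeColor": "#1e1e1e",
            "backgroundColor": "transparent",
        }
    ],
    "appState": {"viewBackgroundColor": "#ffffff"},
    "files": {},
}

# Write it out; the result opens in Excalidraw like any generated diagram.
with open("minimal.excalidraw", "w") as f:
    json.dump(doc, f, indent=2)
```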
Setup
Step 1 — Build llama.cpp
```bash
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build -j$(nproc)
```

On macOS with Apple Silicon, add -DLLAMA_METAL=ON for GPU acceleration.
Step 2 — Download a GGUF model
Recommended models (best JSON output quality):
| Model | Size | HuggingFace path |
|---|---|---|
| Qwen2.5-7B-Instruct (recommended) | ~4.5 GB | Qwen/Qwen2.5-7B-Instruct-GGUF |
| Llama-3.1-8B-Instruct | ~4.7 GB | |
| Mistral-7B-Instruct-v0.3 | ~4.1 GB | |
```bash
# Inside the llama.cpp directory:
mkdir models

# Download with huggingface-cli (pip install huggingface_hub):
huggingface-cli download Qwen/Qwen2.5-7B-Instruct-GGUF \
  qwen2.5-7b-instruct-q4_k_m.gguf \
  --local-dir models/
```

Step 3 — Start the llama.cpp server
```bash
# From inside the llama.cpp directory:
./build/bin/llama-server \
  -m models/qwen2.5-7b-instruct-q4_k_m.gguf \
  --port 8080 \
  -c 8192 \
  --host 0.0.0.0
```

Verify it's running:

```bash
curl http://localhost:8080/health
# → {"status":"ok"}
```

Step 4 — Install the MCP server
```bash
# Clone this repo
git clone <repo-url>
cd exclalidraw_mcp

# Install with uv (recommended)
uv sync

# Or with pip
pip install -e .
```

Verify the CLI entry point works:

```bash
excalidraw-mcp --help
```

Step 5 — Configure your MCP client
Claude Desktop (Linux)
Edit ~/.config/claude/claude_desktop_config.json:
```json
{
  "mcpServers": {
    "excalidraw": {
      "command": "excalidraw-mcp"
    }
  }
}
```

If using uv, replace "command": "excalidraw-mcp" with: "command": "uv", "args": ["--directory", "/absolute/path/to/exclalidraw_mcp", "run", "excalidraw-mcp"]
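Assembled, the uv-based configuration looks like this (the directory path is a placeholder for your checkout location):

```json
{
  "mcpServers": {
    "excalidraw": {
      "command": "uv",
      "args": ["--directory", "/absolute/path/to/exclalidraw_mcp", "run", "excalidraw-mcp"]
    }
  }
}
```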
Claude Desktop (macOS)
Edit ~/Library/Application Support/Claude/claude_desktop_config.json with the same content.
Cursor / VS Code
Add to your MCP settings with the same server config above.
Restart the app after editing the config.
Step 6 — Run Excalidraw locally (optional)
You can always use excalidraw.com for free. But to run it fully locally:
```bash
docker run -p 5000:80 excalidraw/excalidraw:latest
# Open http://localhost:5000
```

Or via Node:

```bash
npx excalidraw
```

Usage
Once the MCP server is connected, ask your AI client:
- Generate a flowchart for a user login system with OAuth
- Draw a microservices architecture for an e-commerce platform with cart, payment, and inventory services
- Create a mind map about machine learning: supervised, unsupervised, reinforcement learning
- Make a sequence diagram showing a REST API request from browser to server to database and back
- Draw an ER diagram for a blog: users, posts, comments, tags

Available MCP Tools
| Tool | Description |
|---|---|
| generate_diagram | Main tool — generate a diagram from text |
| | Verify llama.cpp is running |
| | List all saved diagrams |
generate_diagram parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| | string | required | What the diagram should show |
| | string | | |
| | string | | Output filename (no extension needed) |
Opening a generated diagram
Diagrams are saved to ~/excalidraw_diagrams/.
1. Open excalidraw.com or your local instance
2. Click the folder icon (top left) → Open
3. Select your .excalidraw file
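For a quick look at what has been generated without opening Excalidraw, a small helper like the following works (an illustrative sketch, not part of the server — it just globs the output directory):

```python
from pathlib import Path

def list_diagrams(directory: Path = Path.home() / "excalidraw_diagrams"):
    """Return saved .excalidraw files, newest first."""
    return sorted(
        directory.glob("*.excalidraw"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )

# Print the names of all saved diagrams (empty if none exist yet).
for path in list_diagrams():
    print(path.name)
```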
Running Tests
```bash
# Install test dependencies
uv add --dev pytest pytest-anyio respx

# Run all tests
pytest tests/ -v
```

Troubleshooting
"llama.cpp server is not running"
Run curl http://localhost:8080/health. If it fails, start the server (Step 3).
"Could not parse LLM output as valid Excalidraw JSON"
The LLM returned malformed JSON. Try:
- Use a better model (Qwen2.5-7B or larger)
- Ensure llama.cpp started with -c 8192 (enough context)
- Try a simpler description first to verify the pipeline works
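Local models often wrap their JSON in markdown fences or surround it with commentary. A tolerant extraction step like the one below (an illustrative sketch, not the project's actual parser) usually recovers the object for debugging:

```python
import json
import re

def extract_json(raw: str) -> dict:
    """Pull the first JSON object out of LLM output, tolerating
    markdown code fences and surrounding chatter."""
    # Prefer the contents of a ```json ... ``` fence if present.
    fence = re.search(r"```(?:json)?\s*(.*?)```", raw, re.DOTALL)
    candidate = fence.group(1) if fence else raw
    # Fall back to the outermost brace pair.
    start, end = candidate.find("{"), candidate.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object found in LLM output")
    return json.loads(candidate[start:end + 1])

print(extract_json('Here you go:\n```json\n{"type": "excalidraw"}\n```'))
# → {'type': 'excalidraw'}
```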
"Diagram looks wrong / missing elements"
- Be more specific in your description
- Specify diagram_type explicitly (e.g., "flowchart" not "freeform")
- Larger models (13B+) produce significantly better layouts
Tool not appearing in Claude Desktop
- Confirm claude_desktop_config.json has no JSON syntax errors
- Restart Claude Desktop fully
- Check logs: ~/.config/claude/logs/ (Linux) or ~/Library/Logs/Claude/ (macOS)
Project Structure
```
exclalidraw_mcp/
├── src/excalidraw_mcp/
│   ├── server.py         ← MCP server + tool definitions
│   ├── llm_client.py     ← llama.cpp HTTP client
│   ├── generator.py      ← Prompt building + JSON parsing + validation
│   └── schema.py         ← Excalidraw element dataclasses
├── prompts/
│   └── examples/         ← Few-shot example diagrams (flowchart, mindmap, sequence)
├── examples/
│   └── sample.excalidraw ← Reference diagram you can open immediately
├── tests/
│   ├── test_generator.py
│   └── test_llm_client.py
├── pyproject.toml
└── README.md
```

Tips for Better Diagrams
- Be specific: "login flow with email/password, JWT token, and session storage" beats "login flow"
- Name your elements: "boxes labeled A, B, C connected by arrows" → Excalidraw follows your naming
- Specify colors: "use blue for services, yellow for databases"
- Keep it focused: one logical concept per diagram works better than trying to show everything
- Regenerate freely: if the first result isn't perfect, ask again with a different filename — it's instant
License
MIT