## 🎥 Demo

→ Try it live at **flompt.dev**. No account, no install needed.

Paste any prompt → AI decomposes it into blocks → drag & reorder → get a Claude-optimized XML prompt.

## ✨ What is flompt?

flompt is a visual prompt engineering tool that transforms how you write AI prompts.

Instead of writing one long block of text, flompt lets you:

- **Decompose**: Paste any prompt and let AI break it into structured blocks
- **Edit visually**: Drag, connect, and reorder blocks in a flowchart editor
- **Recompile**: Generate a Claude-optimized, machine-ready prompt from your flow

Think of it as Figma for prompts: visual, structured, and built for Claude.
## 🧩 Block Types

13 specialized blocks that map directly to Claude's prompt engineering best practices:

| Block | Purpose | Claude XML |
|---|---|---|
| Document | External content grounding | `<document>` |
| Role | AI persona & expertise | `<role>` |
| Audience | Who the output is written for | `<audience>` |
| Context | Background information | `<context>` |
| Objective | What to DO | `<objective>` |
| Goal | End goal & success criteria | `<goal>` |
| Input | Data you're providing | `<input>` |
| Constraints | Rules & limitations | `<constraints>` |
| Examples | Few-shot demonstrations | `<examples>` |
| Chain of Thought | Step-by-step reasoning | `<chain_of_thought>` |
| Output Format | Expected output structure | `<output_format>` |
| Response Style | Verbosity, tone, prose, markdown (structured UI) | `<response_style>` |
| Language | Response language | `<language>` |
Blocks are automatically ordered following Anthropic's recommended prompt structure.
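As an illustration, this automatic ordering amounts to a stable sort against a canonical list. The list below is a sketch that mirrors the block table; the exact order flompt's compiler uses may differ:

```python
# Sketch of canonical block ordering. The exact list flompt uses may differ;
# this one mirrors the order of the block-type table above.
CANONICAL_ORDER = [
    "document", "role", "audience", "context", "objective", "goal",
    "input", "constraints", "examples", "chain_of_thought",
    "output_format", "response_style", "language",
]

def order_blocks(blocks: list[dict]) -> list[dict]:
    """Return blocks sorted by their position in CANONICAL_ORDER (stable sort)."""
    return sorted(blocks, key=lambda b: CANONICAL_ORDER.index(b["type"]))
```

For example, `order_blocks([{"type": "objective"}, {"type": "role"}])` moves the role block in front of the objective, regardless of input order.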
## 🚀 Try It Now

→ **flompt.dev**. No account needed. Free and open-source.
## 🧩 Browser Extension

Use flompt directly inside ChatGPT, Claude, and Gemini, without leaving your tab.

- **Enhance** button injected into the AI chat input
- Bidirectional sync between the sidebar and the chat
- Works on ChatGPT · Claude · Gemini
## 🤖 Claude Code Integration (MCP)

flompt exposes its core capabilities as native tools inside Claude Code via the Model Context Protocol (MCP).

Once configured, you can call `decompose_prompt`, `compile_prompt`, and `list_block_types` directly from any Claude Code conversation: no browser, no copy-paste.
### Installation

**Option 1 – CLI (recommended):**

```shell
claude mcp add --transport http --scope user flompt https://flompt.dev/mcp/
```

The `--scope user` flag makes flompt available in all your Claude Code projects.

**Option 2 – Project config (`.mcp.json`):**
```json
{
  "mcpServers": {
    "flompt": {
      "type": "http",
      "url": "https://flompt.dev/mcp/"
    }
  }
}
```

### Available Tools
Once connected, 3 tools are available in Claude Code:
**`decompose_prompt(prompt: str)`**

Breaks down a raw prompt into structured blocks (role, objective, context, constraints, etc.).

- Uses Claude or GPT on the server if an API key is configured
- Falls back to keyword-based heuristic analysis otherwise
- Returns a list of typed blocks plus the full JSON to pass to `compile_prompt`
Input: `"You are a Python expert. Write a function that parses JSON and handles errors."`

Output:

```
✅ 3 blocks extracted:
[ROLE] You are a Python expert.
[OBJECTIVE] Write a function that parses JSON…
[CONSTRAINTS] handles errors
📋 Full blocks JSON: [{"id": "...", "type": "role", ...}, ...]
```

**`compile_prompt(blocks_json: str)`**
Compiles a list of blocks into a Claude-optimized XML prompt.

- Takes the JSON from `decompose_prompt` (or manually crafted blocks)
- Reorders blocks following Anthropic's recommended structure
- Returns the final XML prompt with an estimated token count

Input: `[{"type": "role", "content": "You are a Python expert", ...}, ...]`

Output:

```
✅ Prompt compiled (142 estimated tokens):
<role>You are a Python expert.</role>
<objective>Write a function that parses JSON and handles errors.</objective>
```

**`list_block_types()`**
Lists all 13 available block types with descriptions and the recommended canonical ordering. Useful when manually crafting blocks.
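To make the decompose → compile round trip concrete, here is a heavily simplified, self-contained sketch of both tools. The keyword rules and helper bodies are hypothetical; the real server uses an AI model when a key is configured and a richer heuristic otherwise:

```python
import re

# Hypothetical miniature of flompt's decompose/compile pipeline.
# The real heuristic and XML compiler are more sophisticated.

def classify(sentence: str) -> str:
    """Naive keyword-based block typing (illustrative only)."""
    s = sentence.lower()
    if s.startswith("you are"):
        return "role"
    if any(word in s for word in ("must", "never", "do not")):
        return "constraints"
    return "objective"

def decompose_prompt(prompt: str) -> list[dict]:
    """Split a raw prompt into typed blocks, one per sentence."""
    sentences = re.split(r"(?<=[.!?])\s+", prompt.strip())
    return [{"type": classify(s), "content": s} for s in sentences if s]

def compile_prompt(blocks: list[dict]) -> str:
    """Reorder blocks and emit Claude-style XML with a rough token estimate."""
    order = ["role", "objective", "constraints"]
    ordered = sorted(blocks, key=lambda b: order.index(b["type"]))
    xml = "\n".join(f"<{b['type']}>{b['content']}</{b['type']}>" for b in ordered)
    tokens = len(xml) // 4  # crude ~4-characters-per-token estimate
    return f"Prompt compiled ({tokens} estimated tokens):\n{xml}"
```

Running `compile_prompt(decompose_prompt("You are a Python expert. Write a parser."))` emits the `<role>` tag before the `<objective>` tag, matching the canonical ordering.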
### Typical Workflow

1. `decompose_prompt("your raw prompt here")` → get structured blocks as JSON
2. Optionally edit the JSON to add/remove/modify blocks
3. `compile_prompt("<json from step 1>")` → get a Claude-optimized XML prompt, ready to use

### Technical Details
| Property | Value |
|---|---|
| Transport | Streamable HTTP (POST) |
| Endpoint | `https://flompt.dev/mcp/` |
| Session | Stateless (each call is independent) |
| Auth | None required |
| DNS rebinding protection | Enabled |
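Because the transport is plain streamable HTTP and each call is stateless, any JSON-RPC 2.0 client can talk to the endpoint directly. Here is a minimal stdlib-only sketch; the helper names are my own, while the `tools/call` method and the dual `Accept` header follow the MCP spec and the health-check example later in this document:

```python
import json
import urllib.request

MCP_URL = "https://flompt.dev/mcp/"

def tool_call_payload(name: str, arguments: dict, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 body for MCP's tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

def call_tool(name: str, arguments: dict) -> bytes:
    """POST a tool call to the MCP endpoint and return the raw response body."""
    body = json.dumps(tool_call_payload(name, arguments)).encode()
    req = urllib.request.Request(
        MCP_URL,
        data=body,
        headers={
            # the endpoint requires both Accept values
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

Calling `call_tool("list_block_types", {})` would then hit the live endpoint; no session setup is needed since each call is independent.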
## 🛠️ Self-Hosting (Local Dev)

### Requirements

- Python 3.12+
- Node.js 18+
- An Anthropic or OpenAI API key (optional: the heuristic fallback works without one)
### Setup

**Backend:**

```shell
cd backend
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
cp .env.example .env   # add your API key
uvicorn app.main:app --reload --port 8000
```

**App (Frontend):**

```shell
cd app
cp .env.example .env   # optional: add PostHog key
npm install
npm run dev
```

**Blog:**
```shell
cd blog
npm install
npm run dev   # available at http://localhost:3000/blog
```

| Service | URL |
|---|---|
| App | http://localhost:5173 |
| Backend API | http://localhost:8000 |
| API Docs (Swagger) | http://localhost:8000/docs |
| MCP endpoint | http://localhost:8000/mcp/ |
## ⚙️ AI Configuration

flompt supports multiple AI providers. Copy `backend/.env.example` to `backend/.env`:

```shell
# Anthropic (recommended)
ANTHROPIC_API_KEY=sk-ant-...
AI_PROVIDER=anthropic
AI_MODEL=claude-3-5-haiku-20241022

# or OpenAI
OPENAI_API_KEY=sk-...
AI_PROVIDER=openai
AI_MODEL=gpt-4o-mini
```

No API key? No problem: flompt falls back to a heuristic decomposer (keyword-based) and structured XML compilation.
## 🏢 Production Deployment

This section documents the exact production setup running at flompt.dev. Everything lives in `/projects/flompt`.

### Architecture

```
Internet
   │
   ▼
Caddy (auto-TLS, reverse proxy) on ports 443/80
   ├── /app*  → Vite SPA static files (app/dist/)
   ├── /blog* → Next.js static export (blog/out/)
   ├── /api/* → FastAPI backend (localhost:8000)
   ├── /mcp/* → FastAPI MCP server (localhost:8000, no buffering)
   ├── /docs* → Reverse proxy to GitBook
   └── /      → Static landing page (landing/)
   │
   ▼
FastAPI (uvicorn, port 8000)
   │
   ▼
Anthropic / OpenAI API
```

Both Caddy and the FastAPI backend are managed by supervisord, itself watched by a keepalive loop.
### 1. Prerequisites

```shell
# Python 3.12+ with pip
python --version

# Node.js 18+
node --version

# Caddy binary placed at /projects/flompt/caddy
# (not committed to git; download from https://caddyserver.com/download)
curl -o caddy "https://caddyserver.com/api/download?os=linux&arch=amd64"
chmod +x caddy

# supervisor installed in a Python virtualenv
pip install supervisor
```

### 2. Environment Variables
**Backend (`backend/.env`):**

```shell
ANTHROPIC_API_KEY=sk-ant-...          # or OPENAI_API_KEY
AI_PROVIDER=anthropic                 # or: openai
AI_MODEL=claude-3-5-haiku-20241022    # model to use for decompose/compile
```

**App frontend (`app/.env`):**

```shell
VITE_POSTHOG_KEY=phc_...              # optional analytics
VITE_POSTHOG_HOST=https://eu.i.posthog.com
```

**Blog (`blog/.env.local`):**

```shell
NEXT_PUBLIC_POSTHOG_KEY=phc_...
NEXT_PUBLIC_POSTHOG_HOST=https://eu.i.posthog.com
```

### 3. Build
All assets must be built before starting services. Use the deploy script, or build manually.

**Full deploy (build + restart + health check):**

```shell
cd /projects/flompt
./deploy.sh
```

**Build only (no service restart):**

```shell
./deploy.sh --build-only
```

**Restart only (no rebuild):**

```shell
./deploy.sh --restart-only
```

**Manual build steps:**

```shell
# 1. Vite SPA → app/dist/
cd /projects/flompt/app
npm run build
# Output: app/dist/ (pre-compressed with gzip, served by Caddy)

# 2. Next.js blog → blog/out/
cd /projects/flompt/blog
rm -rf .next out   # clear cache to avoid stale builds
npm run build
# Output: blog/out/ (full static export, no Node server needed)
```

### 4. Process Management
Production processes are managed by supervisord (`supervisord.conf`):

| Program | Command | Port | Log |
|---|---|---|---|
| flompt-backend | uvicorn (FastAPI) | 8000 | /tmp/flompt-backend.log |
| flompt-caddy | caddy run | 443/80 | /tmp/flompt-caddy.log |

Both programs have `autorestart=true` and `startretries=5`, so they automatically restart on crash.

**Start supervisord (first boot or after a full restart):**

```shell
supervisord -c /projects/flompt/supervisord.conf
```

**Common supervisorctl commands:**
```shell
# Check status of all programs
supervisorctl -c /projects/flompt/supervisord.conf status

# Restart backend only (e.g. after a code change)
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-backend

# Restart Caddy only (e.g. after a Caddyfile change)
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-caddy

# Restart everything
supervisorctl -c /projects/flompt/supervisord.conf restart all

# Stop everything
supervisorctl -c /projects/flompt/supervisord.conf stop all

# Read real-time logs
tail -f /tmp/flompt-backend.log
tail -f /tmp/flompt-caddy.log
tail -f /tmp/flompt-supervisord.log
```

### 5. Keepalive Watchdog
`keepalive.sh` is an infinite bash loop (running as a background process) that:

- Checks every 30 seconds whether supervisord is alive
- If supervisord is down, kills any zombie process occupying port 8000 (via inode lookup in `/proc/net/tcp`)
- Restarts supervisord
- Logs all events to `/tmp/flompt-keepalive.log`

**Start keepalive (should be running at all times):**

```shell
nohup /projects/flompt/keepalive.sh >> /tmp/flompt-keepalive.log 2>&1 &
echo $!   # note the PID
```

**Check if keepalive is running:**

```shell
ps aux | grep keepalive.sh
tail -f /tmp/flompt-keepalive.log
```

Note: `keepalive.sh` uses the same Python virtualenv path as supervisord. If you reinstall supervisor in a different venv, update the `SUPERVISORD` and `SUPERVISORCTL` paths at the top of `keepalive.sh`.
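The liveness probe at the heart of the watchdog can be sketched in a few lines of Python. The real script is bash and additionally clears zombie processes on port 8000; the pidfile path below is an assumption for illustration:

```python
import os

# Assumed pidfile location; the actual path is set in supervisord.conf.
PIDFILE = "/tmp/flompt-supervisord.pid"

def pid_alive(pid: int) -> bool:
    """Probe a PID with signal 0: nothing is delivered, existence is checked."""
    try:
        os.kill(pid, 0)
        return True
    except ProcessLookupError:
        return False
    except PermissionError:
        return True  # process exists but belongs to another user

def supervisord_alive(pidfile: str = PIDFILE) -> bool:
    """One watchdog iteration: is the process recorded in the pidfile running?"""
    try:
        with open(pidfile) as f:
            pid = int(f.read().strip())
    except (OSError, ValueError):
        return False
    return pid_alive(pid)
```

`keepalive.sh` effectively runs this check every 30 seconds and restarts supervisord whenever it fails.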
### 6. Caddy Configuration

`Caddyfile` handles all routing for flompt.dev. Key rules (in priority order):

- `/blog*` → Static Next.js export at `blog/out/`
- `/api/*` → FastAPI backend at `localhost:8000`
- `/health` → FastAPI health check
- `/mcp/*` → FastAPI MCP server (`flush_interval -1` for streaming)
- `/mcp` → 308 redirect to `/mcp/` (avoids upstream 307 issues)
- `/docs*` → Reverse proxy to GitBook (external)
- `/app*` → Vite SPA at `app/dist/` (gzip precompressed)
- `/` → Static landing page at `landing/`

**Reload Caddy after a Caddyfile change:**

```shell
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-caddy
# or directly:
/projects/flompt/caddy reload --config /projects/flompt/Caddyfile
```

Caddy auto-manages TLS certificates via Let's Encrypt: no manual SSL setup needed.
### 7. Health Checks

The deploy script runs these checks automatically. You can also run them manually:

```shell
# Backend API
curl -s https://flompt.dev/health
# → {"status":"ok","service":"flompt-api"}

# Landing page
curl -s -o /dev/null -w "%{http_code}" https://flompt.dev/
# → 200

# Vite SPA
curl -s -o /dev/null -w "%{http_code}" https://flompt.dev/app
# → 200

# Blog
curl -s -o /dev/null -w "%{http_code}" https://flompt.dev/blog/en
# → 200

# MCP endpoint (requires Accept header)
curl -s -o /dev/null -w "%{http_code}" \
  -X POST https://flompt.dev/mcp/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","method":"tools/list","id":1}'
# → 200
```

### 8. Updating the App
**After a backend code change:**

```shell
cd /projects/flompt
git pull
supervisorctl -c supervisord.conf restart flompt-backend
```

**After a frontend change:**

```shell
cd /projects/flompt
git pull
cd app && npm run build
# No service restart needed: Caddy serves static files directly
```

**After a blog change:**

```shell
cd /projects/flompt
git pull
cd blog && rm -rf .next out && npm run build
# No service restart needed
```

**After a Caddyfile change:**

```shell
supervisorctl -c /projects/flompt/supervisord.conf restart flompt-caddy
```

**Full redeploy from scratch:**

```shell
cd /projects/flompt && ./deploy.sh
```

### 9. Log Files Reference
| File | Content |
|---|---|
| /tmp/flompt-backend.log | FastAPI/uvicorn stdout + stderr |
| /tmp/flompt-caddy.log | Caddy access + error logs |
| /tmp/flompt-supervisord.log | supervisord daemon logs |
| /tmp/flompt-keepalive.log | keepalive watchdog events |
## 🏗️ Tech Stack

| Layer | Technology |
|---|---|
| Frontend | React 18, TypeScript, React Flow v11, Zustand, Vite |
| Backend | FastAPI, Python 3.12, Uvicorn |
| MCP Server | FastMCP (streamable HTTP transport) |
| AI | Anthropic Claude / OpenAI GPT (pluggable) |
| Reverse Proxy | Caddy (auto-TLS via Let's Encrypt) |
| Process Manager | Supervisord + keepalive watchdog |
| Blog | Next.js 15 (static export), Tailwind CSS |
| Extension | Chrome & Firefox MV3 (content script + sidebar) |
| i18n | 10 languages: EN, FR, ES, DE, PT, JA, TR, ZH, AR, RU |
## 🚀 Features

- 🎨 **Visual flowchart editor**: Drag-and-drop blocks with React Flow
- 🤖 **AI-powered decomposition**: Paste a prompt, get structured blocks
- ⚡ **Async job queue**: Non-blocking decomposition with live progress tracking
- 🦾 **Claude-optimized output**: XML structured following Anthropic best practices
- 🧩 **Browser extension**: Enhance button inside ChatGPT, Claude & Gemini (Chrome & Firefox)
- 🤖 **Claude Code MCP**: Native tool integration via Model Context Protocol
- 📱 **Responsive**: Full touch support, tap-to-connect
- 🌙 **Dark theme**: Mermaid-inspired warm dark UI
- 🌍 **10 languages**: EN, FR, ES, DE, PT, JA, TR, ZH, AR, RU, each with a dedicated indexed page for SEO
- 💾 **Auto-save**: Local persistence with Zustand
- ⌨️ **Keyboard shortcuts**: Power-user friendly
- 📤 **Export**: Copy, download as TXT or JSON
- 🔓 **Open-source**: MIT licensed, self-hostable
## 🤝 Contributing

Contributions are welcome: bug reports, features, translations, and docs!

Read `CONTRIBUTING.md` to get started. The full changelog is in `CHANGELOG.md`.