Uses your chosen LLM provider's capabilities to plan, parallelize, and execute multi-file AI coding tasks.
Orchex Community
Describe what you want. Orchex plans, parallelizes, and executes — safely.
Community hub and Cursor plugin for Orchex — autopilot AI orchestration for multi-file coding tasks.
What is Orchex?
Orchex is an MCP server that orchestrates AI coding agents in parallel. Give it a task and it auto-plans streams of work, executes them across waves with ownership enforcement (no file conflicts), and self-heals on failure. It supports six LLM providers with BYOK (Bring Your Own Key).
Cursor Plugin
Install from Marketplace
Search for "Orchex" in the Cursor Marketplace, or visit cursor.com/marketplace.
Manual Setup
Add to your Cursor MCP config (~/.cursor/mcp.json):
{
  "mcpServers": {
    "orchex": {
      "command": "npx",
      "args": ["-y", "@wundam/orchex"]
    }
  }
}

Orchex auto-detects API keys from your shell environment (ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY, DEEPSEEK_API_KEY). No env block needed.
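As a rough illustration of that auto-detection step, here is a minimal sketch of checking the shell environment for provider keys. The `detect_providers` helper and the provider-name mapping are assumptions for illustration, not Orchex's actual implementation; only the four key names come from the docs.

```python
import os

# Hypothetical mapping: provider name -> environment variable.
# The key names are from the docs above; the helper itself is illustrative.
PROVIDER_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
}

def detect_providers(env=os.environ):
    """Return the providers whose API key is set and non-empty."""
    return [name for name, var in PROVIDER_KEYS.items() if env.get(var)]

if __name__ == "__main__":
    print("Detected providers:", detect_providers())
```

Because detection reads the environment at startup, keys exported in your shell profile are picked up with no extra configuration.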
Usage
1. Open Cursor and ask: "Build a REST API with authentication and tests"
2. Orchex generates a plan, shows you the streams, and executes on approval
3. Streams run in parallel waves — each stream owns its files, no conflicts
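The wave model can be sketched conceptually. This is not Orchex's code: the `Stream` class, the conflict check, and the thread-pool execution are illustrative assumptions about how exclusive file ownership per stream could prevent write conflicts within a wave.

```python
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class Stream:
    # Hypothetical stand-in for an Orchex work stream.
    name: str
    files: frozenset  # files this stream exclusively owns

    def run(self):
        # A real stream would invoke an LLM agent; here we just report.
        return f"{self.name} edited {sorted(self.files)}"

def check_ownership(wave):
    """Reject a wave if any two streams claim the same file."""
    seen = {}
    for stream in wave:
        for f in stream.files:
            if f in seen:
                raise ValueError(f"conflict: {f} owned by {seen[f]} and {stream.name}")
            seen[f] = stream.name

def run_waves(waves):
    """Execute waves sequentially; streams within a wave run in parallel."""
    results = []
    for wave in waves:
        check_ownership(wave)  # ownership enforcement before the wave starts
        with ThreadPoolExecutor() as pool:
            results.extend(pool.map(Stream.run, wave))
    return results

if __name__ == "__main__":
    waves = [
        [Stream("api", frozenset({"routes.py"})), Stream("auth", frozenset({"auth.py"}))],
        [Stream("tests", frozenset({"test_api.py"}))],
    ]
    for line in run_waves(waves):
        print(line)
```

Enforcing ownership before a wave starts, rather than merging afterwards, is what lets streams run concurrently without a conflict-resolution step.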
Community
Use GitHub Discussions for:
| Category | Purpose |
| --- | --- |
| Announcements | Release notes, breaking changes |
| Bug Reports | Issues you encounter during beta |
| Feedback | Feature requests, UX suggestions |
| Q&A | Setup help, troubleshooting |
Links
orchex.dev — Website & docs
@wundam/orchex on npm — Package
Integration guides — Claude Code, Cursor, Windsurf, and more