Petamind MCP
A Claude Code MCP server for a multi-candidate agentic coding loop: reasoner plan → generate patches → deterministic gates → mandatory vision scoring → select the best candidate as the winner.
Poetiq-style refinement loop (descriptive, not affiliated): This project uses “Poetiq-style” descriptively to refer to iterative refinement loops (generate → critique → refine → verify). It is not affiliated with Poetiq.
Setup guide: docs/MCP_PETAMIND_MCP.md.
Vertex setup: docs/VERTEX_SETUP.md.
Troubleshooting: docs/TROUBLESHOOTING.md.
MCP Quick Start (Claude Code)
Option A (recommended): install from PyPI via pipx
Then add the MCP server to Claude Code (user scope):
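A minimal sketch of the PyPI route, assuming the package and console commands are both named `petamind-mcp` (and the setup helper `petamind-setup`, as described in the notes below); check the package listing if they differ:

```shell
# Install the server into an isolated environment
pipx install petamind-mcp

# One-time setup: downloads Playwright Chromium for the vision loop
petamind-setup

# Register the server with Claude Code at user scope
claude mcp add --scope user petamind-mcp -- petamind-mcp
```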
Notes:
`petamind-setup` installs Playwright Chromium (required for the mandatory vision loop).
You do not need Google Cloud credentials to use `petamind_eval_patch` with `vision_provider=client` (the default).
Option B: install from a git clone (contributors / hacking)
From this repo root:
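A sketch of an editable install, assuming a standard pyproject-based layout; the setup helper name follows the notes above:

```shell
# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate

# Editable install from the repo root
pip install -e .

# One-time setup: downloads Playwright Chromium for the vision loop
petamind-setup
```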
Then follow docs/MCP_PETAMIND_MCP.md to add the server to Claude Code via .mcp.json or claude mcp add-json.
Minimal Claude Code config (user scope)
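A minimal `.mcp.json` sketch following Claude Code's MCP server schema; the server command name `petamind-mcp` is an assumption based on the package name above:

```json
{
  "mcpServers": {
    "petamind-mcp": {
      "command": "petamind-mcp",
      "args": []
    }
  }
}
```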
Included: Synthetic UI Dataset Factory
This repo also includes a production-grade synthetic dataset generator for UI/UX design tasks (landing pages, directories, dashboards) using Next.js App Router + TypeScript + Tailwind.
Features
Multi-model pipeline: Uses Vertex AI (DeepSeek, Kimi, MiniMax) and OpenRouter (Devstral, vision models)
Quality gating: Only winners pass through to training data (build success + vision score threshold)
Resumable: SQLite caching for model responses, task state persistence
Two output tracks:
`public/` (publishable models only) and `private/` (all models)
No contamination: Chain-of-thought/thinking is never stored; only structured specs + code
Claude Code MCP (agentic coding)
This repo also ships an MCP server (petamind-mcp) that exposes a multi-candidate
patch/test/vision loop to Claude Code. Setup guide: docs/MCP_PETAMIND_MCP.md.
Quick Start
1. Environment Setup
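A typical setup sketch; the requirements file name is an assumption, so check the repo for the actual dependency manifest:

```shell
# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate

# Install project dependencies (file name assumed)
pip install -r requirements.txt

# Browser needed for the Playwright rendering stage
playwright install chromium
```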
2. Configure Environment Variables
Required:
`GOOGLE_CLOUD_PROJECT`: Your GCP project ID
`GOOGLE_CLOUD_REGION`: Region for Vertex AI (e.g., `us-central1`)
`OPENROUTER_API_KEY`: Your OpenRouter API key
Optional:
GCS_BUCKET: For cloud backup of outputs
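For example, the variables above could be exported in your shell (placeholder values shown):

```shell
export GOOGLE_CLOUD_PROJECT=my-gcp-project
export GOOGLE_CLOUD_REGION=us-central1
export OPENROUTER_API_KEY=sk-or-...
export GCS_BUCKET=my-output-bucket   # optional
```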
3. Authenticate with Google Cloud
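The Vertex AI client libraries pick up Application Default Credentials, so the standard gcloud login flow is enough:

```shell
# Obtain Application Default Credentials via browser login
gcloud auth application-default login

# Point gcloud at the project configured above
gcloud config set project "$GOOGLE_CLOUD_PROJECT"
```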
4. Run
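The entrypoint name below is an assumption for illustration; check the repo root for the actual script or module:

```shell
# Launch the pipeline (entrypoint name assumed)
python main.py
```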
Configuration
Edit config/config.yaml to customize:
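A hypothetical excerpt to show the kinds of knobs involved; these key names are illustrative, so consult `config/config.yaml` for the real schema:

```yaml
niches: 100                  # niche/task generation
tasks_per_niche: 7
candidates_per_task: 2       # variants per generator model
vision_score_threshold: 7.0  # quality gate for winners
viewports: [390, 768, 1280]  # screenshot widths in px
```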
Pipeline Stages
Niche/Task Generation: Creates 100 niches × 7 tasks = 700+ tasks
Planning: DeepSeek generates UI_SPEC JSON for each task
UI Generation: Kimi + MiniMax generate code candidates (2 variants each)
Validation: Next.js build with Devstral-powered fix loops
Rendering: Playwright captures screenshots at 3 viewport sizes
Scoring: Vision judge (or heuristic fallback) scores candidates
Selection: Best candidate per task selected for training
Export: Winners exported to train.jsonl / valid.jsonl
Output Structure
Training Data Format
Each line in train.jsonl:
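A hypothetical example record; the actual field names and structure may differ, so inspect an exported `train.jsonl` for the real schema:

```json
{"task_id": "a1b2c3", "page_type": "landing", "ui_spec": {"...": "..."}, "code": "// final TSX source ...", "score": 8.4}
```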
Page Types Covered
landing: Marketing landing pages
directory_home: Directory homepage with search
city_index: City-specific listing pages
category_index: Category-specific listing pages
listing_profile: Individual listing detail pages
admin_dashboard: Admin/analytics dashboards
edit: Refactor/edit tasks (20% of dataset)
Development
Architecture Notes
Provider abstraction: Clean interface for Vertex AI and OpenRouter
Deterministic IDs: Tasks have stable IDs from hash(niche_id + page_type + seed)
JSON strictness: Safe extraction with fallback parsing
Async throughout: Uses asyncio for concurrent model calls
No thinking storage: Only structured UI_SPEC and final code stored
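The deterministic-ID note above can be sketched like this; the function name and 12-character truncation are illustrative choices, not the repo's exact implementation:

```python
import hashlib

def task_id(niche_id: str, page_type: str, seed: int) -> str:
    """Stable task ID: identical inputs always hash to the identical ID,
    which is what makes the pipeline resumable across runs."""
    key = f"{niche_id}:{page_type}:{seed}".encode()
    return hashlib.sha256(key).hexdigest()[:12]
```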
License
MIT