Create a new Hatchable project. This generates a URL slug, creates a dedicated PostgreSQL database, and returns the project ID and URLs. Call this first before writing files or creating tables.
## Project structure
```
public/                static files, served at their file path
api/                   backend functions — each file is one endpoint
  hello.js             → /api/hello
  users/list.js        → /api/users/list
  users/[id].js        → /api/users/:id   (req.params.id — one segment)
  docs/[...path].js    → /api/docs/*path  (req.params.path — string[], catches multi-segment)
  _lib/                shared code, not routed
migrations/*.sql       SQL files, run in filename order on every deploy
seed.sql               optional — runs on first deploy / fork, once per project
hatchable.toml         optional overrides (cron, auth, project name)
package.json           dependencies (no build scripts yet — build locally, commit public/)
```
### Routing precedence
Most-specific wins. For a request to `/api/users/42`:
1. `api/users/42.js` (static) — beats
2. `api/users/[id].js` (single-param, `params.id = "42"`) — beats
3. `api/users/[...rest].js` (catch-all, `params.rest = ["42"]`)
Catch-all params arrive as `string[]`, never slash-joined. Use `req.params.path` as an array:
`const [first, ...rest] = req.params.path;`
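As a concrete sketch, a hypothetical `api/docs/[...path].js` handler might split the segments like this (the response shape is illustrative):

```js
// api/docs/[...path].js — hypothetical catch-all handler
export default async function docsHandler(req, res) {
  // req.params.path is a string[], one entry per URL segment
  const [section, ...rest] = req.params.path;
  if (!section) {
    return res.status(404).json({ error: "No document specified" });
  }
  // e.g. GET /api/docs/guides/setup → { section: "guides", page: "setup" }
  res.json({ section, page: rest.join("/") });
}
```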
### Static file resolution (public/)
A request to `/foo/bar/baz` tries, in order:
1. `public/foo/bar/baz` (exact file)
2. `public/foo/bar/baz.html`
3. `public/foo/bar/baz/index.html`
4. Ancestor `index.html` fallback — walks up: `public/foo/bar/index.html` → `public/foo/index.html` → `public/index.html`
Step 4 means each folder with an `index.html` acts as its own mini-site. You can ship
an `/admin/*` React SPA alongside a static marketing page at `/` — unmatched paths
under `/admin/` fall back to `public/admin/index.html`, not the root one.
## Handler contract
Every file under api/ exports a default async function:
```js
// api/users/list.js
import { db, auth } from "hatchable";
export default async function (req, res) {
  const user = auth.getUser(req);
  if (!user) return res.status(401).json({ error: "Not logged in" });
  const { rows } = await db.query(
    "SELECT id, name FROM users WHERE org_id = $1",
    [user.id]
  );
  res.json(rows);
}
// Optional: restrict methods
export const methods = ["GET"];
// Optional: register this endpoint as a recurring scheduled task.
// Minimum interval is hourly. See also: scheduler.at() in the SDK
// for imperative / one-shot / per-firing-payload scheduling.
// export const schedule = "0 */6 * * *";
```
### req (Express-shaped)
- method, url, path, headers, cookies, params, query
- body — parsed by Content-Type: JSON → object, urlencoded → object,
multipart/form-data → object of non-file fields
- files — present for multipart uploads:
[{ field, filename, contentType, buffer }]
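For example, a sketch of an upload endpoint that pairs `req.files` with the SDK's `storage.put` (the key scheme is illustrative, and `storage.put` is assumed awaitable):

```js
// api/upload.js — sketch; key naming is illustrative
import { storage, auth } from "hatchable";
export default async function (req, res) {
  const user = auth.getUser(req);
  if (!user) return res.status(401).json({ error: "Not logged in" });
  const file = (req.files ?? [])[0];
  if (!file) return res.status(400).json({ error: "No file uploaded" });
  // Namespace keys per user; the buffer and contentType come from req.files
  const key = `uploads/${user.id}/${Date.now()}-${file.filename}`;
  const url = await storage.put(key, file.buffer, file.contentType);
  res.json({ key, url });
}
export const methods = ["POST"];
```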
### res (Express-shaped)
- res.json(data), res.status(code) (chainable), res.send(text|buffer)
- res.redirect(url), res.cookie(name, value, opts), res.setHeader(name, value)
## SDK — import from "hatchable"
Everything you need lives under one import. Do not reach for npm packages
that duplicate these — the deploy linter rejects `puppeteer-core`,
`@anthropic-ai/sdk`, `pg`, `nodemailer`, `bullmq`, `ioredis`,
`@aws-sdk/client-s3`, `child_process`, etc. and points you here.
```
// project storage / SQL
db.query(sql, params) → { rows, rowCount }
db.transaction([{sql, params}, ...]) → { results: [...] }
storage.put(key, buffer, contentType) → url
storage.get(key) → { buffer, contentType }
storage.del(key)
// identity + comms
auth.getUser(req) → { id, email, name } | null
email.send({ to, subject, html })
// scheduling + background work
scheduler.at(when, route, opts?) → armed task
scheduler.cancel(taskId)
// browser, AI, knowledge — managed services, no npm install
browser.html(url) / browser.pdf(url) / browser.screenshot(url)
browser.session(async page => { ... }) → puppeteer-shaped
ai.generateText({ model: 'sonnet', prompt | messages, system?, tools?, maxSteps?, purpose? })
ai.streamText(opts) → AsyncIterator
ai.embed(input) → { embedding } | { embeddings }
knowledge.base(name, { dimensions }).add/search/searchByVector/remove/table
```
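Putting a few of these together — a sketch of a report endpoint (the route, URL, and key are invented; return shapes assume the signatures above, e.g. that `browser.pdf` resolves to a buffer):

```js
// api/report.js — sketch combining browser, storage, and email
import { browser, storage, email, auth } from "hatchable";
export default async function (req, res) {
  const user = auth.getUser(req);
  if (!user) return res.status(401).json({ error: "Not logged in" });
  // Render a page to PDF, keep a copy in storage, and mail the link.
  const pdf = await browser.pdf("https://example.com/dashboard");
  const url = await storage.put(`reports/${Date.now()}.pdf`, pdf, "application/pdf");
  await email.send({
    to: user.email,
    subject: "Your report",
    html: `<p>Your report is ready: <a href="${url}">download</a></p>`,
  });
  res.json({ url });
}
```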
External HTTP via global `fetch` (routed through Hatchable's egress
proxy automatically). Project secrets are declared in `hatchable.toml`
under `[[secret]]`; humans paste values via the platform-rendered
setup gate. `ai.generateText` reads keys server-side via the gateway —
never via raw `process.env`.
### What you cannot do
- Spawn binaries (no `child_process`, no shell).
- Persist to local filesystem between requests (use `storage` instead).
- Open a long-lived TCP/WebSocket server.
- Install npm packages with native bindings — Hatchable does not run
`npm install` at deploy. The SDK above replaces every common reason
to reach for one.
### Scheduling
Two ways to schedule a function — pick based on whether the "when" is
known at deploy time or at runtime.
**Declared** (static, lives in source, reconciled on deploy):
```js
// api/nightly-report.js
export const schedule = "0 9 * * *"; // 5-field cron, minimum hourly
export default async function (req, res) { /* ... */ }
```
**Armed** (dynamic, from user code, preserved across deploys):
```js
import { scheduler } from "hatchable";
// recurring — first arg is a 5-field cron string
await scheduler.at("0 * * * *", "/api/ping");
// one-shot at a specific moment, with per-firing payload
await scheduler.at("2026-05-01T07:00:00Z", "/api/book", {
  payload: { missionId: 42 }
});
// idempotent named arm — repeated calls update the same task
await scheduler.at("0 9 * * *", "/api/digest", { name: "daily-digest" });
// cancel by id
await scheduler.cancel(taskId);
```
Each firing invokes the route with `req.headers['x-hatchable-trigger'] === 'cron'`
and `req.body` set to the arm's `payload`. Use one-shot + payload instead of
writing your own "pending jobs" table with a polling cron — that's the pattern
this primitive replaces.
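A handler armed this way might distinguish cron firings from ordinary requests like so (a sketch; the payload shape is whatever you armed it with):

```js
// api/book.js — sketch of the handler behind the one-shot arm above
export default async function bookHandler(req, res) {
  // Scheduler firings carry this header; reject everything else.
  if (req.headers["x-hatchable-trigger"] !== "cron") {
    return res.status(403).json({ error: "Cron-only endpoint" });
  }
  const { missionId } = req.body ?? {}; // the payload given to scheduler.at
  if (missionId == null) {
    return res.status(400).json({ error: "Missing missionId" });
  }
  res.json({ booked: missionId });
}
```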
## Database
Postgres. Write schema in `migrations/*.sql`. Files run in filename order,
tracked in `__hatchable_migrations` so each runs once.
Always use RETURNING to get inserted ids in the same round trip:
```sql
INSERT INTO users (email) VALUES ($1) RETURNING id
```
Never call lastval() or LAST_INSERT_ID() — each db.query is a fresh
connection, so session-local state doesn't carry across calls.
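From handler code the same pattern looks like this (a sketch; the table mirrors the SQL above):

```js
// api/users/create.js — sketch
import { db } from "hatchable";
export default async function (req, res) {
  const { email } = req.body;
  // RETURNING hands back the generated id in the same round trip —
  // no lastval()/currval(), which won't survive across db.query calls
  const { rows } = await db.query(
    "INSERT INTO users (email) VALUES ($1) RETURNING id",
    [email]
  );
  res.status(201).json({ id: rows[0].id });
}
export const methods = ["POST"];
```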
## Available APIs
Functions run in V8 isolates. You get:
- The full Hatchable SDK (see above).
- Plain JS / TypeScript (no transpile step needed for modern syntax).
- `fetch` for external HTTP (routed through Hatchable's egress proxy
  for quota + accounting; requests pass through transparently to the target URL).
- Web Crypto and standard ECMAScript builtins.
- Pure-JS npm packages — anything that doesn't need native bindings,
filesystem persistence, child processes, or raw sockets. Common
ones used regularly: csv-parse, xlsx, bcrypt, jsonwebtoken, uuid,
date-fns, lodash, marked, sanitize-html, cheerio, xml2js, qrcode,
stripe.
- Declared secrets via `process.env.KEY` (only for `[[secret]]` entries
in hatchable.toml that have `expose = true`; the project owner pastes
the value through the setup gate). Most secrets are SDK-mediated and
never reach process.env — see the secrets docs.
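For example, an endpoint calling an external API with a declared secret might look like this (a sketch: `RATES_API_KEY` is a hypothetical `[[secret]]` with `expose = true`, and the upstream URL is illustrative):

```js
// api/rates.js — sketch; RATES_API_KEY and the upstream URL are hypothetical
export default async function ratesHandler(req, res) {
  const upstream = await fetch("https://api.example.com/v1/rates?base=USD", {
    // Declared, exposed secrets surface on process.env
    headers: { Authorization: `Bearer ${process.env.RATES_API_KEY}` },
  });
  if (!upstream.ok) {
    return res.status(502).json({ error: "Rate service unavailable" });
  }
  res.json(await upstream.json());
}
```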
What's NOT available — and the SDK alternative:
| You wanted | Use this |
|---|---|
| `puppeteer-core` / chromium | `import { browser } from "hatchable"` |
| `pg` / `mysql2` / SQL drivers | `import { db } from "hatchable"` |
| `@anthropic-ai/sdk` / `openai` | `import { ai } from "hatchable"` (BYOK — declare `ANTHROPIC_API_KEY` under `[[secret]]` in `hatchable.toml`) |
| `nodemailer` / `@sendgrid/mail` | `import { email } from "hatchable"` |
| `@aws-sdk/client-s3` | `import { storage } from "hatchable"` |
| `ioredis` / `@upstash/redis` | `db` — use a Postgres table for KV-shaped state (Redis clients aren't available) |
| `bullmq` / `bull` | `import { scheduler } from "hatchable"` — one-shot arms with payloads replace job queues |
| `sharp` / `jimp` | URL-based storage transforms (planned); `browser.screenshot` for HTML→image |
| `fs.writeFileSync('/tmp/...')` | `storage.put(key, bytes)` |
| `child_process.spawn` | not available — use `browser` for chromium, file an issue otherwise |
The deploy linter rejects deploys that import the deny-listed
packages and points you at the right SDK module by name. You'll see
the redirect message before the deploy lands.
## Visibility
Three tiers — each one a step up in who the software is for:
- **personal** — free. You and anyone you invite. Login-gated via
Hatchable accounts. Build anything including auth — test the full
flow with your invitees before going live.
- **public** — $12/mo. On the open web. Custom domains. No branding.
No app-level auth (use Hatchable identity only).
- **app** — $39/mo. On the open web + your app has its own users.
Email/password signup, OAuth, password reset. If your project has
[auth] enabled, this is the only live tier — you can't go Public
with auth, you go straight to App.
## Calling the API from public/
At deploy time, Hatchable injects a tiny bootstrap into every HTML file:
```js
window.__HATCHABLE__ = { slug: "my-app", api: "/api" };
```
Use it as the base URL:
```js
const API = window.__HATCHABLE__.api;
fetch(API + "/users/list").then(r => r.json()).then(render);
```
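Writes work the same way — a sketch of a JSON POST against the injected base (the `/todos/create` endpoint is hypothetical):

```js
// public/app.js — sketch; the /todos/create endpoint is made up
async function createTodo(title) {
  const API = window.__HATCHABLE__.api; // injected bootstrap
  const r = await fetch(API + "/todos/create", {
    method: "POST",
    headers: { "Content-Type": "application/json" }, // parsed into req.body
    body: JSON.stringify({ title }),
  });
  if (!r.ok) throw new Error("Request failed: " + r.status);
  return r.json();
}
```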
## Auth (optional)
Enable auth in hatchable.toml to get a complete passwordless login flow
with one config block. The platform auto-mounts /api/auth/* — do not
write files under api/auth/ when auth is enabled.
```toml
[auth]
enabled = true
providers = ["email"]
```
The flow is email-only and passwordless: enter email, receive a 6-digit
code, optionally bind a passkey for one-tap returning logins. There are
no passwords.
Frontend: every page on a project with [auth] enabled automatically gets
window.hatchable.auth — the platform-managed client that wraps every
endpoint plus the WebAuthn ceremony. Don't fetch /api/auth/* directly,
don't import a WebAuthn library:
```js
const r = await window.hatchable.auth.startLogin({ email });
// r.has_passkey tells the UI whether to offer the passkey button
await window.hatchable.auth.verifyCode({ email, code }); // → { user }
await window.hatchable.auth.signInWithPasskey({ email }); // → { user }
await window.hatchable.auth.registerPasskey(); // post-signin or settings
await window.hatchable.auth.passkeys.list(); // [{ id, name, ... }]
await window.hatchable.auth.passkeys.remove(id);
await window.hatchable.auth.signOut();
await window.hatchable.auth.getSession(); // current session
window.hatchable.auth.supportsPasskeys(); // gate passkey UI
```
Server side, use auth.requireUser / auth.getUser exactly as before. The
platform-mounted endpoints (under /api/auth/*) are an implementation
detail of window.hatchable.auth — you don't write fetch() calls
to them, and you can't put your own files at api/auth/anything.js.
Users live in these tables inside your project's own database:
`users`, `sessions`, `verifications`, `passkeys`
You can extend the users table with your own columns:
```sql
-- migrations/002_user_profile.sql
ALTER TABLE users ADD COLUMN phone text;
ALTER TABLE users ADD COLUMN tier text DEFAULT 'free';
```
You CANNOT drop or rename users/sessions/verifications/passkeys or create
your own tables with those names — the deploy will fail with a clear error.
In your API functions, use auth.requireUser to gate routes:
```js
import { auth, db } from "hatchable";
export default async function (req, res) {
  const user = await auth.requireUser(req, res);
  if (!user) return; // requireUser already wrote the 401
  const { rows } = await db.query(
    "SELECT * FROM bookings WHERE user_id = $1",
    [user.id]
  );
  res.json(rows);
}
```
For the canonical login + passkey UI shapes, read skills `auth/enable-app-auth`
and `auth/register-a-passkey`.
## Deploy
After writing files, call the `deploy` tool. It runs migrations, seeds
(first deploy only), copies public/ to the CDN, registers api/ routes,
and — if [auth] enabled — provisions the auth tables in your database.