Structured
Define schemas. Write structured data. Query with SQL. All as Parquet files you own.
Ingest GitHub, Shopify, and Stripe webhook events into structured memories for querying and analysis, and send SQL query results as Slack messages via templates.
What is this?
Structured gives AI agents (and humans) persistent, queryable memory:
Define a memory — name + typed schema (like a table)
Write records — buffered and auto-flushed to Parquet files
Query with SQL — DuckDB runs directly against the Parquet files
Own your data — everything is local files on disk, no vendor lock-in
Works with Claude Desktop, Cursor, Windsurf, Cline, or any MCP-compatible client.
Architecture
┌──────────────────────────────────────────────────────┐
│ docker compose up │
│ │
│ ┌──────────┐ ┌──────────────┐ ┌────────────┐ │
│ │Dashboard │ │ REST API │ │ MCP Server │ │
│ │ :3000 │───▶│ :3001 │◀───│ stdio │ │
│ │ │ │ │ │ │ │
│ │Vite+React│ │ Hono + WASM │ │ 9 tools │ │
│ └──────────┘ │ │ └────────────┘ │
│ │ SQLite (sql.js) │
│ │ DuckDB (wasm) │
│ │ Parquet (tiny-parquet) │
│ └──────┬───────┘ │
│ │ │
│ ./data/ │
│ ├── structured.db ← metadata │
│ └── parquet/ ← your data │
│ ├── user_prefs/ │
│ │ └── 2026/04/09/...parquet │
│ └── campaign_data/ │
│ └── 2026/04/09/...parquet │
└──────────────────────────────────────────────────────┘

Zero native dependencies. Everything runs in WASM — no C++ builds, no platform issues.
Quick Start
1. Clone and configure
git clone https://github.com/structured-sh/structured.git
cd structured

Edit docker-compose.yml and set your credentials:
environment:
- API_KEY=your-secret-api-key # Used by MCP clients & scripts
- DASHBOARD_PASSWORD=your-password # Protects the dashboard UI

Two separate credentials:
API_KEY— machine auth for MCP clients, scripts, and analytics ingestion
DASHBOARD_PASSWORD— human auth for the web dashboard. Leave unset to disable login (local-only use).
2. Start the stack
docker compose up -d

Service | URL |
Dashboard | http://localhost:3000 |
API | http://localhost:3001 |
MCP | stdio (auto-connected) |
3. Create a memory
curl -X POST http://localhost:3001/memories \
-H "Authorization: Bearer your-secret-api-key" \
-H "Content-Type: application/json" \
-d '{
"name": "user_preferences",
"fields": [
{ "name": "key", "type": "string" },
{ "name": "value", "type": "string" },
{ "name": "priority", "type": "int32" }
],
"description": "User preference settings"
}'

4. Write data
curl -X POST http://localhost:3001/memories/user_preferences/write \
-H "Authorization: Bearer your-secret-api-key" \
-H "Content-Type: application/json" \
-d '{
"data": [
{ "key": "theme", "value": "dark", "priority": 1 },
{ "key": "language", "value": "en", "priority": 2 }
]
}'

5. Query with SQL
curl -X POST http://localhost:3001/query \
-H "Authorization: Bearer your-secret-api-key" \
-H "Content-Type: application/json" \
-d '{ "sql": "SELECT * FROM user_preferences ORDER BY priority" }'Memory names work as table names — DuckDB resolves them to Parquet files automatically.
6. Access raw files
# Files on disk
ls ./data/parquet/user_preferences/
# DuckDB CLI
duckdb -c "SELECT * FROM read_parquet('./data/parquet/user_preferences/**/*.parquet')"
# Python
import duckdb
duckdb.sql("SELECT * FROM './data/parquet/user_preferences/**/*.parquet'").show()

Connect to Claude / Cursor
Local (Docker)
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS):
{
"mcpServers": {
"structured": {
"command": "docker",
"args": ["exec", "-i", "structured-mcp", "node", "index.js"]
}
}
}

For Cursor, add to Settings → Features → MCP Servers:
{
"structured": {
"command": "docker",
"args": ["exec", "-i", "structured-mcp", "node", "index.js"]
}
}

Cloud (structured.sh)
Coming soon — hosted version at structured.sh
{
"mcpServers": {
"structured": {
"command": "npx",
"args": ["-y", "@anthropic-ai/mcp-proxy", "https://mcp.structured.sh"]
}
}
}

MCP Tools
Once connected, just talk naturally. The AI picks the right tool.
Tool | What to say |
| "Create a memory for tracking daily sales with fields: date, revenue, units" |
| "What memories do I have?" |
| "Show me the schema for daily_sales" |
| "Save today's sales: date=2026-04-09, revenue=1250.50, units=42" |
| "What's the total revenue this month?" |
| "Remember this config for later" |
| "Get the document abc-123" |
| "Flush all pending data to disk" |
| "Delete the test_data memory" |
Ingesting Data from Your Apps
Use the templates in templates/ to send events from external systems:
Template | Use case |
ingest-app-analytics.js | Mobile/web app events (installs, actions, purchases) |
| Stripe, GitHub, Shopify webhooks |
| Generate SQL reports → terminal, Markdown, or Slack |
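For the webhook path, a write has the same shape as the curl example in the Quick Start: wrap each event in a data array and POST it to the memory's /write endpoint. A hedged sketch — the memory name and field mapping below are illustrative, not taken from the template:

```javascript
// Shape an incoming webhook event into a Structured write request.
// The memory name and field names here are illustrative.
function toWriteRequest(memory, event, base = "http://localhost:3001") {
  return {
    url: `${base}/memories/${memory}/write`,
    body: {
      data: [
        {
          event_type: event.type,
          event_id: event.id,
          received_at: Date.now(),
          payload: JSON.stringify(event), // keep the raw event for later queries
        },
      ],
    },
  };
}
```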
// Example: track an install from your iOS app
import { track } from './templates/ingest-app-analytics.js';
await track('install', 'user_abc', { platform: 'ios', app_version: '1.0.0' });

Then query across all your events:
SELECT event, COUNT(*) as n, COUNT(DISTINCT user_id) as users
FROM app_events
WHERE timestamp > now() - INTERVAL '30 days'
GROUP BY event ORDER BY n DESC

Dead Letter Queue
Records rejected by strict or evolve schema modes are not dropped — they're automatically written to _dlq_{memory_name} for inspection:
-- See what was rejected and why
SELECT _reason, _payload, _rejected_at
FROM _dlq_app_events
ORDER BY _rejected_at DESC
LIMIT 20

API Key Rotation
Rotate your API key at any time from the dashboard Connect page without restarting:
Go to Connect → API Key section
Click Rotate Key
Copy the new key (shown once)
Update your MCP config and any scripts
The old key is invalidated immediately.
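Rotation is easiest when scripts never hard-code the key. A small sketch — authHeaders is an illustrative helper, not part of the project:

```javascript
// Read the API key from the environment so rotating it only means
// updating API_KEY wherever a script runs, never editing code.
function authHeaders(env = process.env) {
  const key = env.API_KEY;
  if (!key) throw new Error("API_KEY is not set");
  return { Authorization: `Bearer ${key}`, "Content-Type": "application/json" };
}
```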
API Reference
Auth (Dashboard)
Method | Path | Description |
| | Check if dashboard auth is enabled |
| | Login with DASHBOARD_PASSWORD |
| | Validate current session |
| | Logout (client discards token) |
Memories
Method | Path | Description |
| | List all memories |
POST | /memories | Create a memory |
| | Get memory details |
| | Update memory |
| | Delete memory |
POST | /memories/{name}/write | Write records |
| | Flush to Parquet |
| | Preview data |
| | List Parquet files |
Query
Method | Path | Description |
POST | /query | Execute DuckDB SQL |
Store (Documents)
Method | Path | Description |
| | Store a JSON document |
| | List collections |
| | List documents |
| | Delete document |
Settings
Method | Path | Description |
| | Get masked current API key |
| | Generate new API key |
Files
Method | Path | Description |
| | Download raw Parquet file |
MCP (SSE)
Method | Path | Description |
| | SSE stream for MCP clients |
| | JSON-RPC messages |
Schema Modes
Mode | Behavior |
| Accept all fields, no validation (default) |
evolve | Accept all, detect and log schema drift |
strict | Reject records with mismatched fields → DLQ |
Field Types
Type | Parquet Type | Notes |
string | BYTE_ARRAY (UTF-8) | Default |
int32 | INT32 | |
int64 | INT64 | |
float | FLOAT | |
double | DOUBLE | |
boolean | BOOLEAN | |
timestamp | INT64 (millis) | Unix ms, UTC |
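As a sketch, here is a create-memory payload exercising each field type. The orders memory and its field names are illustrative, and the lowercase type strings (including timestamp) are inferred from the Parquet mappings in this section:

```javascript
// Illustrative create-memory payload using every field type.
const ordersSchema = {
  name: "orders",
  description: "Order events",
  fields: [
    { name: "order_id", type: "string" },
    { name: "quantity", type: "int32" },
    { name: "total_cents", type: "int64" },
    { name: "weight_kg", type: "float" },
    { name: "unit_price", type: "double" },
    { name: "gift", type: "boolean" },
    { name: "placed_at", type: "timestamp" }, // Unix ms, UTC
  ],
};

// POST it exactly like the Quick Start example:
// await fetch("http://localhost:3001/memories", {
//   method: "POST",
//   headers: { Authorization: `Bearer ${key}`, "Content-Type": "application/json" },
//   body: JSON.stringify(ordersSchema),
// });
```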
Stack
Only 7 npm packages. No native addons — everything runs in WASM or pure JS.
Package | What it does |
hono | HTTP router — fast, Web-standard API |
@duckdb/duckdb-wasm | SQL query engine, runs fully in WASM |
sql.js | SQLite in WASM — stores schema metadata |
tiny-parquet | Pure-JS Parquet writer, zero native deps |
@modelcontextprotocol/sdk | MCP server/tool registration |
| Schema validation for API inputs |
Runtime: Node.js ≥ 20
Because everything runs in WASM, there are no C++ builds, no node-gyp, no platform-specific binaries. The Docker image works on any architecture Docker supports.
Development
# Local dev (no Docker)
cd api && npm install && node index.js
cd dashboard && npm install && npm run dev
# Full stack
docker compose up --build

License
Apache 2.0