Codehooks.io MCP Server
An MCP (Model Context Protocol) server that provides AI agents with database operations, serverless code deployment, and file management capabilities on the Codehooks.io platform.
Available functionality
Database & Collections
Query and update collections (including metadata) with filters and sorting
Create and manage collections
Import/export data (JSON, JSONL, CSV)
Add schemas and indexes, cap collections
Code Deployment
Deploy JavaScript serverless functions
File Operations
Upload files to cloud storage
List and browse files
Delete files
Inspect file metadata
Key-Value Store
Store key-value pairs
Retrieve one or many key-value pairs
Delete key-value pairs
Set time-to-live (TTL) for key-value pairs
System Operations
View application logs
Access API documentation (local documentation for the MCP agent)
Setup
Get Codehooks Admin Token (keep it secret!)
Create MCP Server Script
Create a folder for your MCP server scripts:
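For example (the folder location is arbitrary; `~/mcp-servers` is just a placeholder):

```shell
mkdir -p ~/mcp-servers
cd ~/mcp-servers
```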
For macOS/Linux - Create codehooks.sh:
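A minimal sketch of what `codehooks.sh` could contain, assuming the MCP server is launched via npx. The package name and environment variable names below are illustrative placeholders, not confirmed values — use the exact ones from your Codehooks setup:

```sh
#!/bin/bash
# Hypothetical wrapper script; package and variable names are illustrative.
export CODEHOOKS_PROJECT_NAME="your_project_name"
export CODEHOOKS_ADMIN_TOKEN="your_admin_token"
export CODEHOOKS_SPACE_NAME="your_space_name"
exec npx -y codehooks-mcp-server
```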
Make it executable:
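For example:

```sh
chmod +x codehooks.sh
```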
For Windows - Create codehooks.bat:
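A corresponding sketch for `codehooks.bat` — as above, the package and variable names are placeholders:

```bat
@echo off
rem Hypothetical wrapper script; package and variable names are illustrative.
set CODEHOOKS_PROJECT_NAME=your_project_name
set CODEHOOKS_ADMIN_TOKEN=your_admin_token
set CODEHOOKS_SPACE_NAME=your_space_name
npx -y codehooks-mcp-server
```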
Replace your_project_name, your_admin_token, and your_space_name with your actual values.
Configure for Claude Desktop
Add to your claude_desktop_config.json:
macOS/Linux:
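A minimal sketch of the entry, pointing at the wrapper script created above (the path is a placeholder):

```json
{
  "mcpServers": {
    "codehooks": {
      "command": "/Users/username/mcp-servers/codehooks.sh"
    }
  }
}
```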
Windows:
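On Windows, point the entry at the batch file instead (again, the path is a placeholder):

```json
{
  "mcpServers": {
    "codehooks": {
      "command": "C:\\Users\\username\\mcp-servers\\codehooks.bat"
    }
  }
}
```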
Configure for Cursor
Add to your ~/.cursor/mcp.json:
macOS/Linux:
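Cursor uses the same `mcpServers` shape; a sketch with a placeholder path:

```json
{
  "mcpServers": {
    "codehooks": {
      "command": "/Users/username/mcp-servers/codehooks.sh"
    }
  }
}
```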
Windows:
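And for Windows, pointing at the batch file (placeholder path):

```json
{
  "mcpServers": {
    "codehooks": {
      "command": "C:\\Users\\username\\mcp-servers\\codehooks.bat"
    }
  }
}
```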
Replace username with your actual username.
Example Requests
"Build a complete survey system: create a database, deploy an API to collect responses, and add search/analytics endpoints"
"Set up a real-time inventory tracker: import my product CSV, create stock update webhooks, and build low-stock alerts"
"Build a webhook processing pipeline: receive webhooks from multiple sources, transform and validate data, then trigger automated actions"
"Build a content management system: create file upload endpoints, set up a metadata database, and deploy content delivery APIs"
"Set up automated data backups: export my collections to JSON files, store them with timestamps, and create restoration endpoints"
How These Examples Work
Complete Survey System
The AI agent would:
Create collections (surveys, responses) for data storage
Add schemas for data validation and structure
Deploy JavaScript endpoints like POST /surveys and GET /surveys/:id/analytics
Create indexes on response fields for fast searching and analytics
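As a rough sketch of what such deployed code could look like, using the codehooks-js app and Datastore APIs — the routes and collection names follow the example, but treat the exact method details as illustrative:

```js
// Hypothetical sketch of the deployed survey API (codehooks-js).
import { app, Datastore } from 'codehooks-js';

// Collect a survey response
app.post('/surveys/:id/responses', async (req, res) => {
  const conn = await Datastore.open();
  const doc = await conn.insertOne('responses', {
    surveyId: req.params.id,
    answers: req.body,
    createdAt: new Date().toISOString(),
  });
  res.json(doc);
});

// Simple analytics: count responses for one survey
app.get('/surveys/:id/analytics', async (req, res) => {
  const conn = await Datastore.open();
  const responses = await conn
    .getMany('responses', { surveyId: req.params.id })
    .toArray();
  res.json({ surveyId: req.params.id, responseCount: responses.length });
});

export default app.init();
```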
Real-time Inventory Tracker
The AI agent would:
Import your CSV to populate the products collection
Deploy webhook handlers for POST /inventory/update and GET /inventory/low-stock
Set up key-value storage for alert thresholds and settings
Create indexes on SKU and stock levels for real-time queries
Webhook Processing Pipeline
The AI agent would:
Deploy webhook receivers like POST /webhooks/stripe and POST /webhooks/github
Create collections for webhook_logs, processed_events, and failed_events
Set up data transformation rules and validation schemas for each webhook source
Use key-value store for rate limiting and duplicate detection with TTL
Deploy action triggers that send emails, update databases, or call other APIs based on webhook data
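The transformation, validation, and duplicate-detection steps above need no platform API at all; as a plain JavaScript sketch (all names are illustrative):

```javascript
// Normalize a webhook payload into a common event shape,
// rejecting duplicates seen within a TTL window.
const seen = new Map(); // event id -> expiry timestamp (ms)

function processWebhook(source, payload, ttlMs = 60000, now = Date.now()) {
  // Validate: every source must provide an id and an event type
  if (!payload || !payload.id || !payload.type) {
    return { ok: false, error: 'invalid payload' };
  }
  // Duplicate detection with TTL
  const expiry = seen.get(payload.id);
  if (expiry && expiry > now) {
    return { ok: false, error: 'duplicate event' };
  }
  seen.set(payload.id, now + ttlMs);
  // Transform: a common shape for downstream action triggers
  return {
    ok: true,
    event: { source, id: payload.id, type: payload.type, receivedAt: now },
  };
}

module.exports = { processWebhook };
```

In the deployed version, the TTL bookkeeping would live in the platform's key-value store rather than in-process memory, so duplicates are caught across function invocations.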
Content Management System
The AI agent would:
Create collections for content, media, and users
Deploy file upload endpoints with POST /upload and GET /content/:id
Upload and manage static files for content delivery
Store metadata linking files to content records with search indexes
Automated Data Backups
The AI agent would:
Export collections to JSON format with timestamps
Upload backup files to cloud storage automatically
Deploy restoration APIs like GET /backups and POST /restore/:backup-id
Store backup metadata in key-value store for tracking and management
Each example demonstrates how multiple MCP tools work together to create complete, production-ready systems through natural conversation with your AI agent.
Security Researchers
We thank the following individuals for responsible disclosure and helping improve the security of this project:
Liran Tal – Reported a command injection vulnerability in the query_collection tool (May 2025)
License
This project is licensed under the MIT License.