Context MCP Server
A CloudFlare Workers-based Model Context Protocol (MCP) server that provides semantic memory and journal capabilities with a zero-setup user experience.
Features
Zero-Setup Experience: Users get unique URLs with no local installation required
Semantic Search: BGE-Base-EN-v1.5 embeddings with vector similarity search
User Isolation: Complete data privacy with user-specific access control
Real-Time Communication: Server-Sent Events (SSE) for live MCP protocol communication
Scalable Architecture: Built on CloudFlare's serverless infrastructure
Core Tools
addMemory: Store memories with semantic search capabilities
searchMemory: Find relevant memories using semantic similarity
addJournal: Create journal entries with optional titles and tags
searchJournals: Search journal entries semantically
getRecentActivity: Get recent memories and journal entries
Architecture
CloudFlare Workers: Serverless compute for the MCP server
D1 Database: SQLite-based storage for structured data
Vectorize: Vector database for semantic search
CloudFlare AI: BGE-Base-EN-v1.5 embeddings generation
KV Store: Session management and caching
Quick Start
Prerequisites
Node.js 18+ installed
CloudFlare account with Workers, D1, and Vectorize access
Wrangler CLI installed and authenticated
Setup
Clone and Install
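The original commands for this step are not included in this copy of the README. A typical sequence, assuming a standard npm project layout (the repository URL and directory name are placeholders):

```bash
# Clone the repository and install dependencies
git clone <repository-url>
cd context-mcp-server
npm install
```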
Database Setup
This script (shown after this list) will:
Create D1 database and update wrangler.toml
Set up database schema with proper indexes
Create Vectorize index for embeddings
Configure KV namespace for sessions
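The setup script itself is not shown here; judging from the npm run setup command referenced under Troubleshooting, this step most likely amounts to:

```bash
# Creates the D1 database, Vectorize index, and KV namespace,
# and records the generated IDs in wrangler.toml
npm run setup
```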
Deploy
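The project's deploy script is not shown in this copy; a standard Wrangler deployment (assumed here to be what the script wraps) looks like:

```bash
# Publish the worker to your CloudFlare account
npx wrangler deploy
```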
Test the Deployment
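The /health endpoint mentioned under Troubleshooting can be used to confirm the worker is live; the expected response body is an assumption:

```bash
# Should return HTTP 200 with a small JSON status payload
curl https://your-worker.workers.dev/health
```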
Optional: Seed Test Data
Usage
For MCP Clients
Connect to your deployed worker using the SSE endpoint:
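The exact endpoint path is not given in this copy of the README; a hedged example, assuming an /sse route scoped by a user ID query parameter (both the path and the parameter name are assumptions):

```
https://your-worker.workers.dev/sse?userId=<your-uuid>
```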
Example with Claude Desktop
Add to your claude_desktop_config.json:
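The original configuration snippet is missing here. One common way to point Claude Desktop at a remote SSE MCP server is through the mcp-remote proxy; the server name, URL, and endpoint path below are placeholders and assumptions, not the project's documented values:

```json
{
  "mcpServers": {
    "context-mcp": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-worker.workers.dev/sse?userId=<your-uuid>"]
    }
  }
}
```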
Direct HTTP API
You can also use HTTP POST requests to the MCP endpoint:
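MCP messages are JSON-RPC 2.0, so a tool can be invoked with a plain POST. The sketch below calls addMemory; the /mcp path, the userId query parameter, and the argument names are illustrative assumptions:

```typescript
// Call the addMemory tool with a JSON-RPC 2.0 "tools/call" request.
// Endpoint path and argument names are assumptions, not the documented API.
const response = await fetch("https://your-worker.workers.dev/mcp?userId=<your-uuid>", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: {
      name: "addMemory",
      arguments: { content: "Quarterly review is scheduled for Friday." },
    },
  }),
});
console.log(await response.json());
```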
Tool Reference
addMemory
Store a new memory with semantic search capabilities.
searchMemory
Search memories using semantic similarity.
addJournal
Create a new journal entry.
searchJournals
Search journal entries semantically.
getRecentActivity
Get recent memories and journal entries.
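The per-tool parameter listings are not included in this copy of the README. The interfaces below are illustrative guesses at plausible input shapes for the five tools above, not the server's actual schemas:

```typescript
// Illustrative input shapes only -- the real tool schemas may differ.
interface AddMemoryInput {
  content: string;   // text to store and embed
  tags?: string[];   // optional labels
}

interface SearchMemoryInput {
  query: string;     // natural-language query
  limit?: number;    // maximum number of results
}

interface AddJournalInput {
  content: string;
  title?: string;    // optional title (mentioned under Core Tools)
  tags?: string[];   // optional tags (mentioned under Core Tools)
}

interface SearchJournalsInput {
  query: string;
  limit?: number;
}

interface GetRecentActivityInput {
  limit?: number;    // how many recent items to return
}
```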
Development
Local Development
This starts a local development server with hot reloading.
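The dev command is not shown in this copy; local development on Workers is normally done through Wrangler's dev server (the npm script name is an assumption):

```bash
# Start a local development server with hot reloading
npm run dev          # typically wraps: npx wrangler dev
```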
Database Operations
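The original commands are missing here; Wrangler's built-in D1 commands cover the usual operations. The database name, table name, and schema file path are placeholders:

```bash
# Run an ad-hoc query against the deployed D1 database
npx wrangler d1 execute <database-name> --remote --command "SELECT COUNT(*) FROM memories;"

# Apply a schema file
npx wrangler d1 execute <database-name> --remote --file ./schema.sql
```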
Type Checking
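A minimal sketch, assuming the project runs the TypeScript compiler directly (the exact npm script is not shown in this copy):

```bash
# Type-check the project without emitting output
npx tsc --noEmit
```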
Project Structure
Configuration
Environment Variables
Set in wrangler.toml under [vars]:
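The project's actual variable names are not listed in this copy; the snippet below only illustrates the [vars] structure with a hypothetical variable:

```toml
[vars]
# Hypothetical example -- replace with the project's real variables
ENVIRONMENT = "production"
```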
Bindings
The worker uses these CloudFlare bindings (see the interface sketch below):
DB: D1 Database for structured data
VECTORIZE: Vector search index
AI: BGE embeddings generation
SESSIONS: KV namespace for sessions
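Inside the worker these bindings surface on the environment object; a TypeScript sketch using the standard @cloudflare/workers-types names (the interface itself is illustrative, not copied from the project):

```typescript
// Bindings exposed to the worker, matching the names configured in wrangler.toml.
export interface Env {
  DB: D1Database;            // structured data (memories, journal entries)
  VECTORIZE: VectorizeIndex; // vector index for semantic search
  AI: Ai;                    // BGE-Base-EN-v1.5 embeddings generation
  SESSIONS: KVNamespace;     // session management and caching
}
```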
Security
User Isolation: All data is scoped to user IDs
UUID Validation: Proper user ID format validation
CORS Headers: Configured for cross-origin requests
Error Handling: No sensitive data exposed in errors
Performance
Vector Search: Sub-100ms semantic similarity queries
Database Queries: Optimized with proper indexing
Connection Management: Automatic cleanup of stale SSE connections
Heartbeat: 30-second intervals to maintain connections (see the sketch below)
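A minimal sketch of the heartbeat idea on Workers, not the server's actual implementation: write an SSE comment frame every 30 seconds and stop when the client disconnects.

```typescript
// Illustrative only: keep an SSE connection alive with periodic comment frames.
function sseWithHeartbeat(request: Request): Response {
  const { readable, writable } = new TransformStream();
  const writer = writable.getWriter();
  const encoder = new TextEncoder();

  // Lines starting with ":" are ignored by SSE clients but prevent
  // intermediaries from closing an idle connection.
  const heartbeat = setInterval(() => {
    writer.write(encoder.encode(": heartbeat\n\n")).catch(() => clearInterval(heartbeat));
  }, 30_000);

  // Clean up when the client goes away.
  request.signal.addEventListener("abort", () => {
    clearInterval(heartbeat);
    writer.close().catch(() => {});
  });

  return new Response(readable, {
    headers: { "Content-Type": "text/event-stream", "Cache-Control": "no-cache" },
  });
}
```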
Monitoring
Health Check
Connection Status
The SSE handler provides connection monitoring capabilities for debugging.
Logs
View real-time CloudFlare Worker logs.
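Based on the npm run logs command referenced under Debug Steps, which presumably wraps Wrangler's tail command:

```bash
# Stream live worker logs
npm run logs         # likely equivalent to: npx wrangler tail
```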
Troubleshooting
Common Issues
Database not found: Run npm run setup to create the database
Embedding errors: Ensure the CloudFlare AI binding is configured
SSE connection issues: Check browser console for connection errors
Vector search returning no results: Verify data was added with embeddings
Debug Steps
Check health endpoint: https://your-worker.workers.dev/health
Verify user ID format (must be valid UUID)
Check CloudFlare dashboard for binding configuration
Review worker logs: npm run logs
Contributing
Fork the repository
Create a feature branch
Make changes and test thoroughly
Submit a pull request
License
MIT License - see LICENSE file for details.
Roadmap
Enhanced metadata filtering for vector search
File attachment support for journal entries
Export/import functionality
Advanced analytics and insights
Multi-language embedding support
Real-time collaboration features
Built with ❤️ using CloudFlare Workers and the Model Context Protocol.