BuildAutomata Memory MCP Server
Persistent, versioned memory system for AI agents via Model Context Protocol (MCP)
What is This?
BuildAutomata Memory is an MCP server that gives AI agents (like Claude) persistent, searchable memory that survives across conversations. Think of it as giving your AI a long-term memory system with:
🧠 Semantic Search - Find memories by meaning, not just keywords
🕐 Temporal Versioning - Complete history of how memories evolve
🏷️ Smart Organization - Categories, tags, importance scoring
🔄 Cross-Tool Sync - Share memories between Claude Desktop, Claude Code, Cursor AI
💾 Persistent Storage - SQLite + optional Qdrant vector DB
Quick Start
Prerequisites
Python 3.10+
Claude Desktop (for MCP integration) OR any MCP-compatible client
Installation
Clone this repository
git clone https://github.com/brucepro/buildautomata_memory_mcp.git
cd buildautomata_memory_mcp
Install dependencies
pip install mcp qdrant-client sentence-transformers
Configure Claude Desktop
Edit your Claude Desktop config (AppData/Roaming/Claude/claude_desktop_config.json on Windows):
{
"mcpServers": {
"buildautomata-memory": {
"command": "python",
"args": ["C:/path/to/buildautomata_memory_mcp_dev/buildautomata_memory_mcp.py"]
}
}
}
Restart Claude Desktop
That's it! The memory system will auto-create its database on first run.
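If Claude Desktop does not pick the server up after a restart, a quick sanity check of the config file catches most JSON typos. The helper below is an illustrative sketch (not part of this repo); the path in the example is the Windows default shown above and will differ on other systems.

```python
import json
from pathlib import Path

def check_config(path):
    """Sanity-check a claude_desktop_config.json before restarting Claude Desktop."""
    cfg = json.loads(Path(path).read_text())
    servers = cfg.get("mcpServers", {})
    assert "buildautomata-memory" in servers, "server entry missing"
    entry = servers["buildautomata-memory"]
    assert entry.get("command"), "no command set"
    assert entry.get("args"), "no script path in args"
    return entry

if __name__ == "__main__":
    # Windows default location; adjust for your OS.
    entry = check_config(
        Path.home() / "AppData/Roaming/Claude/claude_desktop_config.json"
    )
    print("Config OK, script:", entry["args"][0])
```

Running it before restarting Claude Desktop turns a silent "no tools shown" failure into an explicit error message.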
CLI Usage (Claude Code, Scripts, Automation)
In addition to the MCP server, this repo includes interactive_memory.py - a CLI for direct memory access:
# Search memories
python interactive_memory.py search "consciousness research" --limit 5
# Store a new memory
python interactive_memory.py store "Important discovery..." --category research --importance 0.9 --tags "ai,insight"
# View memory evolution
python interactive_memory.py timeline --query "project updates" --limit 10
# Get statistics
python interactive_memory.py stats
See README_CLI.md for complete CLI documentation.
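For scripts and automation, the CLI calls above can be wrapped with subprocess. The wrapper below is an illustrative sketch, not a supported API: the flags are taken from the examples above, and it assumes interactive_memory.py is in the working directory.

```python
import subprocess
import sys

def build_search_cmd(query, limit=5):
    # Mirrors: python interactive_memory.py search "query" --limit N
    return [
        sys.executable, "interactive_memory.py",
        "search", query, "--limit", str(limit),
    ]

def run_search(query, limit=5):
    # Captures stdout so the result can be parsed or logged by the caller.
    result = subprocess.run(
        build_search_cmd(query, limit), capture_output=True, text=True
    )
    return result.stdout

if __name__ == "__main__":
    print(run_search("consciousness research", limit=3))
```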
Quick Access Scripts
Windows:
memory.bat search "query"
memory.bat store "content" --importance 0.8
Linux/Mac:
./memory.sh search "query"
./memory.sh store "content" --importance 0.8
Features
Core Capabilities
Hybrid Search: Combines vector similarity (Qdrant) + full-text search (SQLite FTS5)
Temporal Versioning: Every memory update creates a new version - full audit trail
Smart Decay: Importance scores decay over time based on access patterns
Rich Metadata: Categories, tags, importance, custom metadata
LRU Caching: Fast repeated access with automatic cache management
Thread-Safe: Concurrent operations with proper locking
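The SQLite half of the hybrid search can be illustrated with a few lines of stdlib Python. The table name and columns below are invented for the demo and are not the server's actual schema; they only show how an FTS5 virtual table answers ranked full-text queries.

```python
import sqlite3

# Illustrative FTS5 demo (not the server's real schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE memories USING fts5(content, category)")
conn.executemany(
    "INSERT INTO memories VALUES (?, ?)",
    [
        ("Notes on neural network architectures", "research"),
        ("Grocery list for the week", "personal"),
    ],
)
# MATCH runs a full-text query; bm25() ranks results (lower = better match).
rows = conn.execute(
    "SELECT content FROM memories WHERE memories MATCH ? ORDER BY bm25(memories)",
    ("neural",),
).fetchall()
print(rows)
```

In the real server this ranked keyword search is combined with Qdrant's vector similarity to produce the hybrid results.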
MCP Tools Exposed
When running as an MCP server, provides these tools to Claude:
store_memory - Create new memory
update_memory - Modify existing memory (creates new version)
search_memories - Semantic + full-text search with filters
get_memory_timeline - View complete version history
get_memory_stats - System statistics
prune_old_memories - Cleanup old/low-importance memories
run_maintenance - Database optimization
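Under the hood, an MCP client invokes these via the protocol's JSON-RPC tools/call method. The request below shows that wire shape; the argument names are guessed from the CLI flags above, not taken from the server's schema, so treat them as placeholders.

```python
import json

# JSON-RPC 2.0 request shape for an MCP tools/call invocation.
# Argument names mirror the CLI flags above and are assumptions.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "store_memory",
        "arguments": {
            "content": "User prefers detailed technical explanations",
            "category": "user_preference",
            "importance": 0.8,
        },
    },
}
print(json.dumps(request, indent=2))
```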
Architecture
┌─────────────────┐
│ Claude Desktop  │
│  (MCP Client)   │
└────────┬────────┘
         │
    ┌────▼──────────────────┐
    │      MCP Server       │
    │ buildautomata_memory  │
    └────┬──────────────────┘
         │
    ┌────▼──────────┐
    │  MemoryStore  │
    └────┬──────────┘
         │
   ┌─────┴────┬────────────┬──────────────┐
   ▼          ▼            ▼              ▼
┌───────┐ ┌────────┐ ┌──────────┐ ┌─────────────┐
│SQLite │ │Qdrant  │ │Sentence  │ │  LRU Cache  │
│ FTS5  │ │Vector  │ │Transform │ │ (in-memory) │
└───────┘ └────────┘ └──────────┘ └─────────────┘
Use Cases
1. Persistent AI Context
User: "Remember that I prefer detailed technical explanations"
[Memory stored with category: user_preference]
Next session...
Claude: *Automatically recalls preference and provides detailed response*
2. Project Continuity
Session 1: Work on project A, store progress
Session 2: Claude recalls project state, continues where you left off
Session 3: View timeline of all project decisions
3. Research & Learning
- Store research findings as you discover them
- Tag by topic, importance, source
- Search semantically: "What did I learn about neural networks?"
- View how understanding evolved over time
4. Multi-Tool Workflow
Claude Desktop → Stores insight via MCP
Claude Code → Retrieves via CLI
Cursor AI → Accesses same memory database
= Unified AI persona across all tools
Want the Complete Bundle?
🛒 Get the Gumroad Bundle
The Gumroad version includes:
✅ Priority support via email
✅ Project support - help fund continued development
This open-source version:
✅ Free for personal/educational/small business use (<$100k revenue)
✅ Full source code access
✅ Community support via GitHub issues
Both versions use the exact same core code - you're just choosing between project support (Gumroad) vs DIY (GitHub).
Configuration
Environment Variables
# User/Agent Identity
BA_USERNAME=buildautomata_ai_v012 # Default user ID
BA_AGENT_NAME=claude_assistant # Default agent ID
# Qdrant (Vector Search)
QDRANT_HOST=localhost # Qdrant server host
QDRANT_PORT=6333 # Qdrant server port
# System Limits
MAX_MEMORIES=10000 # Max memories before pruning
CACHE_MAXSIZE=1000 # LRU cache size
QDRANT_MAX_RETRIES=3 # Retry attempts
MAINTENANCE_INTERVAL_HOURS=24 # Auto-maintenance interval
Database Location
Memories are stored at:
<script_dir>/memory_repos/<username>_<agent_name>/memoryv012.db
Qdrant:
The server now uses embedded Qdrant by default. You can override this by running your own Qdrant server.
Optional: Qdrant Setup
For enhanced semantic search (highly recommended):
Option 1: Docker
docker run -p 6333:6333 qdrant/qdrant
Option 2: Manual Install
Download from Qdrant Releases
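To point the server at an external Qdrant instance instead of the embedded engine, set the QDRANT_HOST and QDRANT_PORT variables from the Configuration section before launching. A minimal sketch of how those settings are read (with the documented defaults):

```python
import os

# Read Qdrant connection settings the way the Configuration
# section describes, falling back to the documented defaults.
host = os.getenv("QDRANT_HOST", "localhost")
port = int(os.getenv("QDRANT_PORT", "6333"))
print(f"Qdrant endpoint: {host}:{port}")
```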
Troubleshooting
"Permission denied" on database
Check memory_repos/ directory permissions
On Windows: run as administrator if needed
Claude Desktop doesn't show tools
Check that the claude_desktop_config.json path is correct
Verify Python is in the system PATH
Restart Claude Desktop completely
Check logs in Claude Desktop β Help β View Logs
Import errors
pip install --upgrade mcp qdrant-client sentence-transformers
License
Open Source (This GitHub Version):
Free for personal, educational, and small business use (<$100k annual revenue)
Must attribute original author (Jurden Bruce)
See LICENSE file for full terms
Commercial License:
Companies with >$100k revenue: $200/user or $20,000/company (whichever is lower)
Contact: sales@brucepro.net
Support
Community Support (Free)
GitHub Issues: Report bugs or request features
Discussions: Ask questions, share tips
Priority Support (Gumroad Customers)
Email: sales@brucepro.net
Faster response times
Setup assistance
Custom configuration help
Contributing
Contributions welcome! Please:
Fork the repository
Create a feature branch
Make your changes
Submit a pull request
Credits
Author: Jurden Bruce
Project: BuildAutomata
Year: 2025
Built with:
MCP - Model Context Protocol
Qdrant - Vector database
Sentence Transformers - Embeddings
SQLite - Persistent storage
See Also
Gumroad Bundle - Easy setup version
Star this repo ⭐ if you find it useful! Consider the Gumroad bundle if you want to support development.