Enhanced MCP Memory
⚡ Optimized for Claude Sonnet 4 - This MCP server is tuned for Claude Sonnet 4, which gives the best results with its AI-powered features.
An enhanced MCP (Model Context Protocol) server for intelligent memory and task management, designed for AI assistants and development workflows. Features semantic search, automatic task extraction, knowledge graphs, and comprehensive project management.
✨ Key Features
🧠 Intelligent Memory Management
Semantic search using sentence-transformers for natural language queries
Automatic memory classification with importance scoring
Duplicate detection and content deduplication
File path associations for code-memory relationships
Knowledge graph relationships with automatic similarity detection
🧬 Sequential Thinking Engine
Structured reasoning chains with a 5-stage process (analysis, planning, execution, validation, reflection)
Context management with automatic token optimization
Conversation continuity across sessions with intelligent summarization
Real-time token estimation and compression (30-70% reduction)
Auto-extraction of key points, decisions, and action items
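The engine's internal data model is not documented here, but a minimal sketch of a five-stage chain, with field names mirroring the add_thinking_step parameters listed under Sequential Thinking Tools below, could look like this:

```python
from dataclasses import dataclass, field
from typing import List

# The five reasoning stages used by the sequential thinking engine.
STAGES = ["analysis", "planning", "execution", "validation", "reflection"]

@dataclass
class ThinkingStep:
    stage: str      # one of STAGES
    title: str      # short label for the step
    content: str    # the reasoning produced at this step
    reasoning: str  # why this step or conclusion was taken

@dataclass
class ThinkingChain:
    objective: str
    steps: List[ThinkingStep] = field(default_factory=list)

    def add_step(self, stage: str, title: str, content: str, reasoning: str) -> None:
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self.steps.append(ThinkingStep(stage, title, content, reasoning))

# Example: a chain that walks the first two stages for one objective.
chain = ThinkingChain(objective="Add retry logic to the HTTP client")
chain.add_step("analysis", "Understand failure modes",
               "Timeouts and 5xx responses dominate.", "Logs show transient errors.")
chain.add_step("planning", "Choose a strategy",
               "Exponential backoff with a retry cap.", "Avoids hammering the upstream service.")
```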
📋 Advanced Task Management
Auto-task extraction from conversations and code comments
Priority and category management with validation
Status tracking (pending, in_progress, completed, cancelled)
Task-memory relationships in knowledge graph
Project-based organization
Complex task decomposition into manageable subtasks
🏗️ Project Convention Learning
Automatic environment detection - OS, shell, tools, and runtime versions
Project type recognition - Node.js, Python, Rust, Go, Java, MCP servers, etc.
Command pattern learning - Extracts npm scripts, Makefile targets, and project commands
Tool configuration detection - IDEs, linters, CI/CD, build tools, and testing frameworks
Dependency management - Package managers, lock files, and installation commands
Smart command suggestions - Corrects user commands based on project conventions
Windows-specific optimizations - Proper path separators and command formats
Memory integration - Stores learned conventions for AI context and future reference
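The server's detection rules are more thorough than this, but project type recognition generally comes down to checking for marker files in the project root. A small illustrative sketch (the marker table below is an assumption, not the server's exact rule set):

```python
from pathlib import Path

# Marker files that commonly identify a project type (illustrative only).
PROJECT_MARKERS = {
    "package.json": "Node.js",
    "pyproject.toml": "Python",
    "requirements.txt": "Python",
    "Cargo.toml": "Rust",
    "go.mod": "Go",
    "pom.xml": "Java",
    "build.gradle": "Java",
}

def detect_project_types(project_path: str) -> set[str]:
    """Return the project types whose marker files exist in the project root."""
    root = Path(project_path)
    return {kind for marker, kind in PROJECT_MARKERS.items() if (root / marker).exists()}

print(detect_project_types("."))  # e.g. {"Python"} when run inside a Python project
```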
📊 Performance Monitoring
Performance monitoring with detailed metrics
Health checks and system diagnostics
Automatic cleanup of old data and duplicates
Database optimization tools
Comprehensive logging and error tracking
Token usage analytics and optimization recommendations
🚀 Easy Deployment
uvx compatible for one-command installation
Zero-configuration startup with sensible defaults
Environment variable configuration
Cross-platform support (Windows, macOS, Linux)
🏗️ Project Structure
🚀 Quick Start
Option 1: Using uvx (Recommended)
Option 2: Manual Installation
Option 3: Development Setup
⚙️ MCP Configuration
Add to your MCP client configuration:
For uvx installation:
For local installation:
🛠️ Available Tools
Core Memory Tools
get_memory_context(query) - Get relevant memories and context
create_task(title, description, priority, category) - Create new tasks
get_tasks(status, limit) - Retrieve tasks with filtering
get_project_summary() - Get comprehensive project overview
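If you want to drive these tools from a script rather than from an AI assistant, the official MCP Python SDK can call them over stdio. A hedged sketch follows; the launch command, package name, and the priority/category values are assumptions, so match them to your own installation and the tools' validation rules:

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Assumed launch command; use whatever your MCP client configuration launches.
    server = StdioServerParameters(command="uvx", args=["enhanced-mcp-memory"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            context = await session.call_tool(
                "get_memory_context", {"query": "database schema decisions"})
            task = await session.call_tool("create_task", {
                "title": "Add index on project lookups",
                "description": "Speed up project-scoped queries",
                "priority": "high",       # assumed value
                "category": "database",   # assumed value
            })
            print(context, task)

asyncio.run(main())
```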
Sequential Thinking Tools
start_thinking_chain(objective) - Begin structured reasoning process
add_thinking_step(chain_id, stage, title, content, reasoning) - Add reasoning steps
get_thinking_chain(chain_id) - Retrieve complete thinking chain
list_thinking_chains(limit) - List recent thinking chains
Context Management Tools
create_context_summary(content, key_points, decisions, actions) - Compress context for token optimization
start_new_chat_session(title, objective, continue_from) - Begin new conversation with optional continuation
consolidate_current_session() - Compress current session for handoff
get_optimized_context(max_tokens) - Get token-optimized context
estimate_token_usage(text) - Estimate token count for planning
Enterprise Auto-Processing
auto_process_conversation(content, interaction_type) - Extract memories and tasks automatically
decompose_task(prompt) - Break complex tasks into subtasks
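The extraction logic behind auto_process_conversation is not reproduced here, but the general idea of pulling action items out of free-form text can be sketched with a few patterns (illustrative only; the server's own extraction is more sophisticated):

```python
import re

# Simple patterns that often signal action items in conversations or code comments.
TASK_PATTERNS = [
    re.compile(r"(?:^|\s)(?:TODO|FIXME)[:\s]+(.+)", re.IGNORECASE),
    re.compile(r"\b(?:we need to|don't forget to|remember to)\s+(.+?)(?:\.|$)", re.IGNORECASE),
]

def extract_candidate_tasks(text: str) -> list[str]:
    """Return task-like phrases found in free-form text."""
    tasks = []
    for line in text.splitlines():
        for pattern in TASK_PATTERNS:
            for match in pattern.finditer(line):
                tasks.append(match.group(1).strip())
    return tasks

notes = "We need to add retries to the uploader. # TODO: write migration for the tags table"
print(extract_candidate_tasks(notes))
```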
Project Convention Tools
auto_learn_project_conventions(project_path) - Automatically detect and learn project patterns
get_project_conventions_summary() - Get formatted summary of learned conventions
suggest_correct_command(user_command) - Suggest project-appropriate command corrections
remember_project_pattern(pattern_type, pattern, description) - Manually store project patterns
update_memory_context() - Refresh memory context with latest project conventions
System Management Tools
health_check() - Check server health and connectivity
get_performance_stats() - Get detailed performance metrics
cleanup_old_data(days_old) - Clean up old memories and tasks
optimize_memories() - Remove duplicates and optimize storage
get_database_stats() - Get comprehensive database statistics
🏗️ Project Convention Learning
The Enhanced MCP Memory Server automatically learns and remembers project-specific conventions to prevent AI assistants from suggesting incorrect commands or approaches:
Automatic Detection
Operating System: Windows vs Unix, preferred shell and commands
Project Type: Node.js, Python, Rust, Go, Java, MCP servers, FastAPI, Django
Development Tools: IDEs, linters, formatters, CI/CD configurations
Package Management: npm, yarn, pip, poetry, cargo, go modules
Build Systems: Vite, Webpack, Make, batch scripts, shell scripts
Smart Command Suggestions
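As a rough sketch of the kind of correction the suggest_correct_command tool performs (the server learns its mappings from the project and environment rather than using a fixed table like this one):

```python
import platform

# Illustrative mapping only; the server derives corrections from learned conventions.
UNIX_TO_WINDOWS = {
    "ls": "dir",
    "cp": "copy",
    "rm": "del",
    "cat": "type",
}

def suggest_command(user_command: str) -> str:
    """Rewrite a Unix-style command for Windows shells, otherwise return it unchanged."""
    if platform.system() != "Windows":
        return user_command
    parts = user_command.split()
    if parts and parts[0] in UNIX_TO_WINDOWS:
        parts[0] = UNIX_TO_WINDOWS[parts[0]]
    return " ".join(parts)

print(suggest_command("ls src"))  # "dir src" on Windows, "ls src" elsewhere
```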
Windows Optimization
Automatically detects Windows environment
Uses cmd.exe and Windows-appropriate path separators
Suggests Windows-compatible commands (e.g., dir instead of ls)
Handles Windows-specific Python and Node.js patterns
Memory Integration
All learned conventions are stored as high-importance memories that:
Appear in AI context for every interaction
Persist across sessions and project switches
Include environment warnings and project-specific guidance
Prevent repeated incorrect command suggestions
🔧 Configuration Options
Configure via environment variables:
Logging level (DEBUG, INFO, WARNING, ERROR)
Maximum memories per project
Token threshold for auto-compression
Auto-cleanup interval
Enable or disable automatic cleanup
Maximum concurrent requests
Request timeout in seconds
Data and log directory (default: ~/ClaudeMemory)
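Reading these settings follows the usual environment-variable pattern. The variable names in this sketch are placeholders for illustration, not necessarily the names the server reads; check the repository for the exact spellings and defaults:

```python
import os
from pathlib import Path

# NOTE: placeholder variable names; the server's actual names may differ.
log_level = os.getenv("LOG_LEVEL", "INFO")
data_dir = Path(os.getenv("DATA_DIR", str(Path.home() / "ClaudeMemory")))
auto_cleanup = os.getenv("AUTO_CLEANUP", "true").lower() == "true"

print(log_level, data_dir, auto_cleanup)
```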
🧪 Testing
This package is production-ready and does not include a test suite in the distributed version. For development or CI, refer to the repository for test scripts and additional resources.
📊 Performance & Monitoring
The server includes built-in performance tracking:
Response time monitoring for all tools
Success rate tracking with error counts
Memory usage statistics
Database performance metrics
Automatic health checks
Access these via the get_performance_stats() and health_check() tools.
🗄️ Database
SQLite for reliable, file-based storage
Automatic schema migrations for updates
Comprehensive indexing for fast queries
Built-in backup and optimization tools
Cross-platform compatibility
Default location: ./data/mcp_memory.db
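Because storage is a plain SQLite file, you can inspect it with the Python standard library. This sketch only reads metadata and makes no assumptions about the schema; adjust DB_PATH if you use a custom data directory:

```python
import sqlite3

DB_PATH = "./data/mcp_memory.db"  # default location noted above

with sqlite3.connect(DB_PATH) as conn:
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
    integrity = conn.execute("PRAGMA integrity_check").fetchone()[0]
    page_count = conn.execute("PRAGMA page_count").fetchone()[0]
    page_size = conn.execute("PRAGMA page_size").fetchone()[0]

print("tables:", tables)
print("integrity:", integrity)
print("size on disk (bytes):", page_count * page_size)
```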
🔍 Semantic Search
Powered by sentence-transformers for intelligent memory retrieval:
Natural language queries - "Find memories about database optimization"
Similarity-based matching using embeddings
Configurable similarity thresholds
Automatic model downloading (~90MB on first run)
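Under the hood this is the standard sentence-transformers embedding-plus-cosine-similarity pattern. The model name below is a common lightweight default (roughly the ~90MB download mentioned above), not necessarily the one the server ships with:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model; the server's may differ

memories = [
    "Added a covering index to speed up task lookups by project.",
    "Team agreed to use exponential backoff for flaky API calls.",
    "Database optimization: enabled WAL mode and tuned page size.",
]
query = "Find memories about database optimization"

memory_embeddings = model.encode(memories, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

scores = util.cos_sim(query_embedding, memory_embeddings)[0]
best = scores.argmax().item()
print(memories[best], float(scores[best]))
```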
🧠 Sequential Thinking
Structured reasoning system:
5-stage thinking process: Analysis → Planning → Execution → Validation → Reflection
Token optimization: Real-time estimation and compression (30-70% reduction)
Context continuity: Intelligent session handoffs with preserved context
Auto-extraction: Automatically identifies key points, decisions, and action items
Performance tracking: Monitor reasoning chains and optimization metrics
💼 Token Management
Advanced context optimization for high-scale deployments:
Smart compression: Pattern-based extraction preserves essential information
Token estimation: Real-time calculation for planning and budgeting
Context summarization: Automatic conversion of conversations to actionable summaries
Session consolidation: Seamless handoffs between conversation sessions
Performance analytics: Detailed metrics on compression ratios and response times
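The exact estimator behind estimate_token_usage is not specified here, but a simple characters-per-token heuristic shows how a token budget can be enforced when assembling context. This is a sketch of the idea, not the server's algorithm:

```python
def estimate_tokens(text: str) -> int:
    """Rough estimate: ~4 characters per token for typical English text."""
    return max(1, len(text) // 4)

def fit_to_budget(chunks: list[str], max_tokens: int) -> list[str]:
    """Greedily keep the most recent chunks that fit within a token budget."""
    kept, used = [], 0
    for chunk in reversed(chunks):          # newest first
        cost = estimate_tokens(chunk)
        if used + cost > max_tokens:
            break
        kept.append(chunk)
        used += cost
    return list(reversed(kept))             # restore chronological order

history = ["user: set up the repo", "assistant: created pyproject.toml", "user: now add CI"]
print(fit_to_budget(history, max_tokens=20))
```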
📝 Logging
Comprehensive logging system:
Daily log rotation in the ./logs/ directory
Structured logging with timestamps and levels
Performance tracking integrated
Error tracking with stack traces
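Daily rotation of this kind maps directly onto the Python standard library. A minimal sketch, where the log file and logger names are illustrative rather than the server's own:

```python
import logging
from logging.handlers import TimedRotatingFileHandler
from pathlib import Path

Path("logs").mkdir(exist_ok=True)
handler = TimedRotatingFileHandler("logs/mcp_memory.log", when="midnight", backupCount=14)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))

logger = logging.getLogger("mcp_memory")
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info("server started")
```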
🤝 Contributing
Fork the repository
Create a feature branch
Add tests for new functionality
Ensure all tests pass
Submit a pull request
📄 License
MIT License - see LICENSE file for details.
🆘 Support
Issues: GitHub Issues
Documentation: README
Discussions: GitHub Discussions
🏷️ Version History
v2.0.2 - Updated package build configuration and license compatibility fixes
v2.0.1 - Enhanced features with sequential thinking and project conventions
v1.2.0 - Enhanced MCP server with performance monitoring and health checks
v1.1.0 - Added semantic search and knowledge graph features
v1.0.0 - Initial release with basic memory and task management