# Project Brief: ACE MCP Server
## Project Overview
**Project Name:** ACE MCP Server (Agentic Context Engineering Model Context Protocol Server)
**Project ID:** ace-mcp-001
**Created:** 2025-10-28
**Status:** Initialization Phase
## Purpose
Implementation of the ACE (Agentic Context Engineering) framework as a Model Context Protocol (MCP) server for Cursor AI and other MCP-compatible clients. The system provides self-improving context management through incremental delta updates, semantic deduplication, and learning from execution feedback.
## Core Objectives
1. **Token Efficiency**: Achieve an 86.9% reduction in adaptation latency through incremental delta updates instead of full context rewrites
2. **Context Quality**: Improve accuracy by 10.6% through structured playbook management
3. **Self-Improvement**: Enable learning from execution feedback without labeled data
4. **Production Ready**: Deploy as MCP server with Docker support for local and remote environments
5. **LLM Flexibility**: Support both OpenAI API and local LM Studio server
## Technical Scope
### Components
- **ACE Core**: Generator, Reflector, Curator components (typed in the sketch after this list)
- **Storage Layer**: Bullet storage, semantic deduplication, embeddings
- **MCP Protocol**: Standard MCP server with 6 tools
- **Web Dashboard**: Interactive demonstration interface
- **Deployment**: Docker support for local and Ubuntu VM environments
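A minimal TypeScript sketch of how the ACE roles and the bullet store might be typed; every name here (`Bullet`, `DeltaUpdate`, `Generator`, `Reflector`, `Curator`, the counters) is an illustrative assumption, not the final API:

```typescript
// Hypothetical type sketch for the ACE core roles and bullet storage.

export interface Bullet {
  id: string;
  content: string;        // one reusable strategy or lesson
  helpfulCount: number;   // bumped when execution feedback is positive
  harmfulCount: number;   // bumped when feedback is negative
  embedding?: number[];   // TF-IDF vector used for semantic deduplication
}

export interface DeltaUpdate {
  add: Bullet[];          // new bullets to append to the playbook
  reinforce: string[];    // ids of existing bullets marked helpful
  demote: string[];       // ids of existing bullets marked harmful
}

export interface Generator {
  // Produce an answer/trajectory for a task using the current playbook.
  generate(task: string, playbook: Bullet[]): Promise<string>;
}

export interface Reflector {
  // Extract lessons from the trajectory and any execution feedback.
  reflect(task: string, trajectory: string, feedback?: string): Promise<string[]>;
}

export interface Curator {
  // Turn lessons into an incremental delta instead of rewriting the context.
  curate(lessons: string[], playbook: Bullet[]): Promise<DeltaUpdate>;
}
```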
### Technology Stack
- **Runtime**: Node.js + TypeScript
- **Protocol**: MCP (Model Context Protocol) via stdio
- **Storage**: JSON-based persistent storage
- **Embeddings**: TF-IDF for semantic similarity
- **Containerization**: Docker + Docker Compose
- **LLM Integration**: OpenAI API + LM Studio local server (see the provider sketch below)
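Because LM Studio exposes an OpenAI-compatible HTTP API, provider switching can reduce to configuration. A hedged sketch of the abstraction layer, assuming a single OpenAI-style `/chat/completions` call; the `LLMProvider` interface, the `createProvider` factory, the environment variable names, and the `/v1` path suffix are all assumptions:

```typescript
// Hypothetical provider abstraction: both providers speak the OpenAI-style
// chat completions API, so only the base URL, key, and model differ.
export interface LLMProvider {
  chat(prompt: string): Promise<string>;
}

class OpenAICompatibleProvider implements LLMProvider {
  constructor(
    private baseUrl: string,
    private apiKey: string,
    private model: string,
  ) {}

  async chat(prompt: string): Promise<string> {
    const res = await fetch(`${this.baseUrl}/chat/completions`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        ...(this.apiKey ? { Authorization: `Bearer ${this.apiKey}` } : {}),
      },
      body: JSON.stringify({
        model: this.model,
        messages: [{ role: "user", content: prompt }],
      }),
    });
    if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
    const data = (await res.json()) as {
      choices: { message: { content: string } }[];
    };
    return data.choices[0].message.content;
  }
}

// Configuration-based switching; LLM_PROVIDER is a hypothetical env variable.
export function createProvider(): LLMProvider {
  if (process.env.LLM_PROVIDER === "lmstudio") {
    return new OpenAICompatibleProvider(
      process.env.LMSTUDIO_BASE_URL ?? "http://10.242.247.136:11888/v1",
      "", // LM Studio typically requires no API key
      process.env.LMSTUDIO_MODEL ?? "local-model",
    );
  }
  return new OpenAICompatibleProvider(
    "https://api.openai.com/v1",
    process.env.OPENAI_API_KEY ?? "",
    process.env.OPENAI_MODEL ?? "gpt-4o-mini",
  );
}
```

Whether the deployed LM Studio instance actually serves the `/v1` path should be verified against the server; the point of the sketch is that switching providers is a configuration change, not a code change.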
## Key Requirements
1. **Dual Deployment Support**:
- Local development with Docker
- Production deployment on Ubuntu VM
2. **LLM Provider Flexibility**:
- OpenAI API integration
- LM Studio local server support (http://10.242.247.136:11888)
- Configuration-based provider switching
3. **MCP Compliance**:
- Standard JSON-RPC 2.0 over stdio
- Tool registration and execution (see the MCP sketch after this list)
- Error handling and logging
4. **ACE Framework Implementation**:
- Incremental delta updates
- Grow-and-refine mechanism
- Semantic deduplication (configurable threshold; see the deduplication sketch after this list)
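For MCP compliance, a minimal sketch of tool registration and execution over stdio, assuming the official `@modelcontextprotocol/sdk` package handles the JSON-RPC 2.0 framing; the tool name `ace_get_playbook` and its schema are hypothetical placeholders for the real six tools:

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  ListToolsRequestSchema,
  CallToolRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "ace-mcp-server", version: "0.1.0" },
  { capabilities: { tools: {} } },
);

// Advertise tools to the client (Cursor or any MCP-compatible host).
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "ace_get_playbook", // hypothetical tool name
      description: "Return the current ACE playbook bullets",
      inputSchema: { type: "object", properties: {}, required: [] },
    },
  ],
}));

// Execute a tool call; errors are reported as MCP error content, not thrown raw.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  try {
    if (request.params.name === "ace_get_playbook") {
      return { content: [{ type: "text", text: JSON.stringify([]) }] };
    }
    throw new Error(`Unknown tool: ${request.params.name}`);
  } catch (err) {
    return { content: [{ type: "text", text: String(err) }], isError: true };
  }
});

// JSON-RPC 2.0 messages flow over stdin/stdout.
await server.connect(new StdioServerTransport());
```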
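For the grow-and-refine side, a sketch of the deduplication check: a candidate bullet is appended only if no existing bullet exceeds a configurable cosine-similarity threshold over TF-IDF-style vectors. The 0.85 default, the `ACE_DEDUP_THRESHOLD` variable, and the function names are assumptions:

```typescript
// Cosine similarity between two sparse term-weight vectors (e.g. TF-IDF maps).
function cosineSimilarity(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0;
  for (const [term, weight] of a) dot += weight * (b.get(term) ?? 0);
  const norm = (v: Map<string, number>) =>
    Math.sqrt([...v.values()].reduce((sum, w) => sum + w * w, 0));
  const denom = norm(a) * norm(b);
  return denom === 0 ? 0 : dot / denom;
}

interface StoredBullet {
  id: string;
  content: string;
  vector: Map<string, number>; // precomputed TF-IDF weights
}

// Grow-and-refine: append the candidate only if it is not a near-duplicate.
function applyDelta(
  playbook: StoredBullet[],
  candidate: StoredBullet,
  threshold = Number(process.env.ACE_DEDUP_THRESHOLD ?? 0.85), // configurable
): { added: boolean; duplicateOf?: string } {
  for (const bullet of playbook) {
    if (cosineSimilarity(bullet.vector, candidate.vector) >= threshold) {
      return { added: false, duplicateOf: bullet.id };
    }
  }
  playbook.push(candidate);
  return { added: true };
}
```

In this sketch a near-duplicate would reinforce the existing bullet's counters rather than grow the playbook, which is what keeps incremental delta updates cheap relative to full rewrites.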
## Success Criteria
- [x] Project structure initialized
- [ ] TypeScript source files implemented
- [ ] Docker and Docker Compose configuration
- [ ] LLM provider abstraction layer
- [ ] MCP server functional testing
- [ ] Dashboard accessibility
- [ ] Documentation complete
- [ ] Production deployment tested
## Constraints
- Must maintain MCP protocol compatibility
- Token usage optimization is critical
- Support for offline/local LLM operation
- Docker-based deployment only (no bare metal for now)
## Research Foundation
Based on ACE framework from Stanford University & SambaNova Systems:
- Paper: "Agentic Context Engineering: Evolving Contexts for Self-Improving Language Models"
- Published: October 2025
- Key Innovation: Incremental context updates vs full rewrites
## Current State
- Project structure exists with package.json
- Dashboard files present (HTML/JS/CSS)
- Source TypeScript files need to be implemented
- Memory Bank initialized
- Docker configuration pending
- LLM integration pending