# Project Brief: ACE MCP Server

## Project Overview

**Project Name:** ACE MCP Server (Agentic Context Engineering Model Context Protocol Server)
**Project ID:** ace-mcp-001
**Created:** 2025-10-28
**Status:** Initialization Phase

## Purpose

Implementation of the ACE (Agentic Context Engineering) framework as a Model Context Protocol (MCP) server for Cursor AI and other MCP-compatible clients. The system provides self-improving context management through incremental delta updates, semantic deduplication, and learning from execution feedback.

## Core Objectives

1. **Token Efficiency**: Achieve an 86.9% reduction in adaptation latency through incremental updates
2. **Context Quality**: Improve accuracy by 10.6% through structured playbook management
3. **Self-Improvement**: Enable learning from execution feedback without labeled data
4. **Production Ready**: Deploy as an MCP server with Docker support for local and remote environments
5. **LLM Flexibility**: Support both the OpenAI API and a local LM Studio server

## Technical Scope

### Components

- **ACE Core**: Generator, Reflector, Curator components
- **Storage Layer**: Bullet storage, semantic deduplication, embeddings
- **MCP Protocol**: Standard MCP server with 6 tools
- **Web Dashboard**: Interactive demonstration interface
- **Deployment**: Docker support for local and Ubuntu VM environments

### Technology Stack

- **Runtime**: Node.js + TypeScript
- **Protocol**: MCP (Model Context Protocol) via stdio
- **Storage**: JSON-based persistent storage
- **Embeddings**: TF-IDF for semantic similarity
- **Containerization**: Docker + Docker Compose
- **LLM Integration**: OpenAI API + LM Studio local server

## Key Requirements

1. **Dual Deployment Support**:
   - Local development with Docker
   - Production deployment on an Ubuntu VM
2. **LLM Provider Flexibility**:
   - OpenAI API integration
   - LM Studio local server support (http://10.242.247.136:11888)
   - Configuration-based provider switching
3. **MCP Compliance**:
   - Standard JSON-RPC 2.0 over stdio
   - Tool registration and execution
   - Error handling and logging
4. **ACE Framework Implementation**:
   - Incremental delta updates
   - Grow-and-refine mechanism
   - Semantic deduplication (configurable threshold)

## Success Criteria

- [x] Project structure initialized
- [ ] TypeScript source files implemented
- [ ] Docker and Docker Compose configuration
- [ ] LLM provider abstraction layer
- [ ] MCP server functional testing
- [ ] Dashboard accessibility
- [ ] Documentation complete
- [ ] Production deployment tested

## Constraints

- Must maintain MCP protocol compatibility
- Token usage optimization is critical
- Support for offline/local LLM operation
- Docker-based deployment only (no bare metal for now)

## Research Foundation

Based on the ACE framework from Stanford University & SambaNova Systems:

- Paper: "Agentic Context Engineering: Evolving Contexts for Self-Improving Language Models"
- Published: October 2025
- Key Innovation: Incremental context updates vs. full rewrites

## Current State

- Project structure exists with package.json
- Dashboard files present (HTML/JS/CSS)
- Source TypeScript files need to be implemented
- Memory Bank initialized
- Docker configuration pending
- LLM integration pending
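The grow-and-refine mechanism depends on semantic deduplication over TF-IDF embeddings with a configurable similarity threshold. A minimal sketch of how that could work in the project's TypeScript stack follows; the names here (`Bullet`, `dedupe`, the idf smoothing, the `0.8` default threshold) are illustrative assumptions, not the project's actual API:

```typescript
// Illustrative sketch: TF-IDF vectors + cosine similarity to drop
// near-duplicate playbook bullets. Not the project's real implementation.

interface Bullet { id: string; text: string; }

function tokenize(text: string): string[] {
  return text.toLowerCase().match(/[a-z0-9]+/g) ?? [];
}

// Build a sparse TF-IDF vector per bullet (smoothed idf so terms shared
// by every bullet still contribute).
function tfidfVectors(bullets: Bullet[]): Map<string, Map<string, number>> {
  const df = new Map<string, number>();
  const tfs = bullets.map(b => {
    const tf = new Map<string, number>();
    for (const tok of tokenize(b.text)) tf.set(tok, (tf.get(tok) ?? 0) + 1);
    for (const tok of tf.keys()) df.set(tok, (df.get(tok) ?? 0) + 1);
    return tf;
  });
  const n = bullets.length;
  const out = new Map<string, Map<string, number>>();
  bullets.forEach((b, i) => {
    const vec = new Map<string, number>();
    for (const [tok, f] of tfs[i]) {
      vec.set(tok, f * (Math.log((1 + n) / (1 + (df.get(tok) ?? 0))) + 1));
    }
    out.set(b.id, vec);
  });
  return out;
}

function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0;
  for (const [tok, v] of a) {
    na += v * v;
    const w = b.get(tok);
    if (w !== undefined) dot += v * w;
  }
  for (const v of b.values()) nb += v * v;
  return na && nb ? dot / (Math.sqrt(na) * Math.sqrt(nb)) : 0;
}

// Keep a bullet only if no already-kept bullet exceeds the threshold.
function dedupe(bullets: Bullet[], threshold = 0.8): Bullet[] {
  const vecs = tfidfVectors(bullets);
  const kept: Bullet[] = [];
  for (const b of bullets) {
    const v = vecs.get(b.id)!;
    if (!kept.some(k => cosine(v, vecs.get(k.id)!) >= threshold)) kept.push(b);
  }
  return kept;
}
```

Because the vectors are sparse maps rather than dense arrays, the cost scales with bullet length rather than vocabulary size, which matters if the playbook grows large.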

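The brief's configuration-based provider switching can exploit the fact that LM Studio exposes an OpenAI-compatible endpoint, so switching reduces to picking a base URL and auth header. A hedged sketch, assuming an OpenAI-compatible `/v1` path on the LM Studio server; `LlmConfig` and `resolveProvider` are hypothetical names, and the `localhost:1234` fallback is LM Studio's usual default rather than anything specified in the brief:

```typescript
// Illustrative sketch of configuration-based LLM provider switching.
// The brief only specifies OpenAI API support plus an LM Studio server
// at http://10.242.247.136:11888; everything else here is an assumption.

type Provider = "openai" | "lmstudio";

interface LlmConfig {
  provider: Provider;
  apiKey?: string;      // required for OpenAI
  lmStudioUrl?: string; // e.g. http://10.242.247.136:11888
}

interface ResolvedProvider {
  baseUrl: string;
  headers: Record<string, string>;
}

function resolveProvider(cfg: LlmConfig): ResolvedProvider {
  switch (cfg.provider) {
    case "openai":
      if (!cfg.apiKey) throw new Error("apiKey is required for the OpenAI provider");
      return {
        baseUrl: "https://api.openai.com/v1",
        headers: { Authorization: `Bearer ${cfg.apiKey}` },
      };
    case "lmstudio":
      // LM Studio's local server speaks the OpenAI chat API; no key needed.
      return {
        baseUrl: `${cfg.lmStudioUrl ?? "http://localhost:1234"}/v1`,
        headers: {},
      };
  }
}
```

With this shape, the rest of the server can hold a single OpenAI-style client and never branch on the provider again, which keeps offline/local operation (a stated constraint) a pure configuration change.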