# ACE MCP Server

Agentic Context Engineering (ACE) - a self-improving AI context framework with Model Context Protocol (MCP) integration for Cursor AI.
## 🎯 Overview
ACE MCP Server is an intelligent development assistant that learns from your coding patterns and automatically enhances your development workflow. It integrates seamlessly with Cursor AI through the Model Context Protocol (MCP), providing contextual code generation, intelligent analysis, and self-improving recommendations.
## ✨ Key Features

- 🤖 **Smart Code Generation** - Context-aware code generation with automatic prompt enhancement
- 🧠 **Intelligent Code Analysis** - Deep code analysis with actionable improvement suggestions
- 📚 **Self-Improving Playbook** - Accumulates knowledge and patterns from your development work
- 🔧 **Multiple LLM Support** - Works with OpenAI, Anthropic Claude, DeepSeek, Google Gemini, Mistral, and LM Studio
- 🐳 **Docker Ready** - Complete containerized solution for local and production deployment
- 🔒 **Secure by Default** - Bearer token authentication and comprehensive security measures
## 🚀 What Makes ACE Special

ACE doesn't just generate code - it learns from your development patterns and improves over time:

1. **Generates** contextual development trajectories
2. **Reflects** on code to extract insights and patterns
3. **Curates** knowledge into a self-improving playbook
4. **Enhances** future interactions with accumulated wisdom
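As a rough sketch of that loop in TypeScript - all names here are illustrative, not the server's actual API:

```typescript
// Illustrative shapes for the ACE loop; not the server's real types.
interface PlaybookEntry {
  kind: "pattern" | "best-practice" | "insight";
  content: string;
}

type Playbook = PlaybookEntry[];

// Stand-in for a call to the configured LLM provider.
async function callLLM(prompt: string): Promise<string> {
  return `LLM output for: ${prompt.slice(0, 40)}...`;
}

// Generator: produce code for a task, conditioned on the playbook.
async function generate(task: string, playbook: Playbook): Promise<string> {
  const context = playbook.map((e) => `- ${e.content}`).join("\n");
  return callLLM(`Task: ${task}\nAccumulated context:\n${context}`);
}

// Reflector: extract reusable insights from the generated code.
async function reflect(code: string): Promise<PlaybookEntry[]> {
  const insight = await callLLM(`Extract one reusable insight from:\n${code}`);
  return [{ kind: "insight", content: insight }];
}

// Curator: merge new entries into the playbook, skipping duplicates.
function curate(playbook: Playbook, fresh: PlaybookEntry[]): Playbook {
  const seen = new Set(playbook.map((e) => e.content));
  return [...playbook, ...fresh.filter((e) => !seen.has(e.content))];
}

// One full ACE iteration: generate -> reflect -> curate.
async function aceStep(task: string, playbook: Playbook): Promise<Playbook> {
  const code = await generate(task, playbook);
  return curate(playbook, await reflect(code));
}
```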
## 📚 Documentation

### 🚀 Getting Started

- **Installation Guide** - Complete setup instructions
- **Project Overview** - Detailed project introduction
- **Quick Start** - Fast track to running ACE

### ⚙️ Setup & Configuration

- **Cursor AI Setup** - Basic MCP integration
- **Enhanced Auto Setup** - Smart auto-enhancement features
- **LLM Providers** - Configure different AI providers

### 🚀 Deployment

- **Production Deployment** - Deploy to production servers
- **Full Deployment Guide** - Complete Docker deployment guide

### 📖 Project Documentation

- **Project Status** - Current development status
- **Architecture** - Technical architecture details
- **GitHub Setup** - Repository initialization
## ⚡ Quick Start
### 1. Clone and Setup
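For example - the repository URL below is a placeholder, so substitute the real one:

```bash
# Clone the repository and install dependencies
git clone https://github.com/your-org/ace-mcp-server.git
cd ace-mcp-server
npm install

# Create your local configuration from the template
cp .env.example .env
```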
### 2. Docker Development
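Assuming the repository ships a standard Compose file, local development typically looks like:

```bash
# Build the images and start the stack in the background
docker compose up -d --build

# Tail the server logs
docker compose logs -f
```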
### 3. Configure Cursor AI

See the detailed setup instructions (a minimal configuration sketch follows the list):

- **Basic Cursor AI Setup** - Initialize your MCP server with the basic ACE tools
- **Enhanced Auto Setup** - Automatically enhance prompts and invoke the appropriate ACE methods
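As a rough sketch, a remote MCP server is typically registered in Cursor's `~/.cursor/mcp.json`. The exact schema depends on your Cursor version, and every value below (URL, port, token) is a placeholder - see the setup guides above for the real configuration:

```json
{
  "mcpServers": {
    "ace": {
      "url": "http://localhost:3000/mcp",
      "headers": {
        "Authorization": "Bearer your-token-here"
      }
    }
  }
}
```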
### 4. Use ACE Commands
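The exact tool names are defined by the server, but once connected the interaction in Cursor's chat looks roughly like this (prompts are illustrative):

```text
"Use ACE to generate a REST endpoint for user registration"
"Ask ACE to analyze src/auth/login.ts and suggest improvements"
"Show me the current ACE playbook"
```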
### 5. View the Playbook

The ACE playbook stores the knowledge and patterns accumulated from your development work. View it in any of the following ways:

- **Option 1:** Via the API endpoint (JSON) - see the `curl` sketch below
- **Option 2:** Using the provided script
- **Option 3:** Via the MCP tool in Cursor AI
- **Option 4:** Via the dashboard
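For Option 1, a `curl` sketch - the endpoint path, port, and token are placeholders, so check your deployment:

```bash
# Fetch the accumulated playbook as JSON (path and token are illustrative)
curl -H "Authorization: Bearer your-token-here" \
  http://localhost:3000/api/playbook
```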
The playbook contains:

- **Patterns** - Code patterns and conventions learned from your work
- **Best Practices** - Development best practices accumulated over time
- **Insights** - Key insights extracted from code reflections
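Conceptually, the stored structure might be shaped like this (a sketch, not the exact on-disk schema):

```json
{
  "patterns": [
    { "id": "p-001", "content": "Prefer async/await over raw promise chains" }
  ],
  "bestPractices": [
    { "id": "b-001", "content": "Validate request bodies at the route boundary" }
  ],
  "insights": [
    { "id": "i-001", "content": "Auth middleware is duplicated across services" }
  ]
}
```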
## 🛠️ Development

### Prerequisites

- Node.js 18+
- Docker & Docker Compose
- TypeScript
### Local Development
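Assuming the standard npm scripts for a TypeScript project (the script names may differ - check `package.json`):

```bash
npm install      # install dependencies
npm run build    # compile TypeScript
npm run dev      # start the server in watch mode
npm test         # run the test suite
```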
### Docker Management
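Typical Compose management commands (service names and profiles depend on the repository's Compose file):

```bash
docker compose ps        # check container status
docker compose logs -f   # follow server logs
docker compose restart   # restart the stack
docker compose down      # stop and remove containers
```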
## 🔧 Configuration

### LLM Providers & Models

ACE supports six LLM providers with a range of models.

### Supported Providers
#### DeepSeek (Recommended) ⭐

- Provider: `deepseek`
- Default Model: `deepseek-chat` (V3.2-Exp)
- Embedding Model: `deepseek-embedding`
- Best for: ACE framework performance, cost-effective
- Pricing: $0.28 / 1M input tokens, $0.42 / 1M output tokens
- Context: 128K tokens; max output: 32K (reasoner mode)

Environment Variables:

```env
LLM_PROVIDER=deepseek
DEEPSEEK_API_KEY=sk-your-deepseek-api-key
DEEPSEEK_MODEL=deepseek-chat
DEEPSEEK_EMBEDDING_MODEL=deepseek-embedding
```
#### OpenAI

- Provider: `openai`
- Models: `gpt-4o`, `gpt-4`, `gpt-3.5-turbo`
- Embedding Models: `text-embedding-3-small`, `text-embedding-3-large`

Environment Variables:

```env
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-your-openai-api-key
OPENAI_MODEL=gpt-4o
OPENAI_EMBEDDING_MODEL=text-embedding-3-small
```
#### Anthropic Claude

- Provider: `anthropic`
- Models: `claude-3-5-sonnet-20241022`, `claude-3-opus`, `claude-3-sonnet`, `claude-3-haiku`

Environment Variables:

```env
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-your-api-key
ANTHROPIC_MODEL=claude-3-5-sonnet-20241022
```
#### Google Gemini

- Provider: `gemini`
- Models: `gemini-1.5-pro`, `gemini-1.5-flash`, `gemini-pro`

Environment Variables:

```env
LLM_PROVIDER=gemini
GOOGLE_API_KEY=your-google-api-key
GOOGLE_MODEL=gemini-1.5-pro
```
#### Mistral

- Provider: `mistral`
- Models: `mistral-large-latest`, `mistral-medium-latest`, `mistral-small-latest`

Environment Variables:

```env
LLM_PROVIDER=mistral
MISTRAL_API_KEY=your-mistral-api-key
MISTRAL_MODEL=mistral-large-latest
```
#### LM Studio (Local/Self-hosted)

- Provider: `lmstudio`
- Models: any local model compatible with the OpenAI API format

Environment Variables:

```env
LLM_PROVIDER=lmstudio
LMSTUDIO_BASE_URL=http://localhost:1234/v1
LMSTUDIO_MODEL=local-model
```
### Environment Variables

Copy `.env.example` to `.env` and configure it:
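A minimal `.env` might look like the sketch below; only the provider variables are documented above, and the server settings shown are illustrative placeholders:

```env
# LLM provider (see "LLM Providers & Models" above)
LLM_PROVIDER=deepseek
DEEPSEEK_API_KEY=sk-your-deepseek-api-key

# Server settings (variable names are illustrative - see .env.example)
PORT=3000
AUTH_TOKEN=your-bearer-token
```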
For the complete set of configuration options, see the `.env.example` file.
## 🤝 Contributing

1. **Read the Documentation** - Start with the Project Overview
2. **Follow Best Practices** - Review the Development Guide
3. **Submit PRs** - Follow our contribution guidelines
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
*ACE MCP Server - Making AI development smarter, one interaction at a time.* 🚀