CogmemAi — Cognitive Memory for AI Coding Assistants
Your AI coding assistant forgets everything between sessions. CogmemAi fixes that.
One command. Your assistant remembers your architecture, patterns, decisions, bugs, and preferences — permanently. Works with Claude Code, Cursor, Windsurf, Cline, Continue, and any MCP-compatible tool. Switch editors, switch models, switch machines — your knowledge stays.
What's New in v3
Memory health score — see how healthy your memory system is at a glance with a 0-100 score and actionable factors
Session replay — pick up exactly where you left off with automatic session summaries loaded at startup
Self-tuning memory — memories automatically adjust importance based on real usage; stale memories auto-archive
Auto-ingest README — when you start a new project, CogmemAi offers to learn from your README instantly
Smart recall — relevant memories surface automatically as you switch topics mid-session
Auto-learning — CogmemAi learns from your sessions automatically, no manual saving needed
Task tracking — save tasks that persist across sessions with status and priority
Correction learning — teach your assistant what went wrong and what's right, so mistakes aren't repeated
Session reminders — set nudges that surface automatically at the start of your next session
Stale memory detection — find outdated memories that need review or cleanup
File change awareness — see what files changed since your last session
Memory consolidation — merge related memories into comprehensive summaries using AI
29 tools — the most complete memory toolkit for AI coding assistants
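For intuition, a 0-100 health score like this is typically a weighted penalty over a few factors. Below is a minimal sketch; the factor names and weights are assumptions for illustration, not CogmemAi's actual formula:

```typescript
// Illustrative only: factor names and weights are assumptions,
// not CogmemAi's actual scoring formula.
interface HealthFactors {
  staleRatio: number;      // fraction of memories not recalled recently (0..1)
  duplicateRatio: number;  // fraction flagged as likely duplicates (0..1)
  untypedRatio: number;    // fraction with no memory type assigned (0..1)
}

function healthScore(f: HealthFactors): number {
  // Start from 100, subtract weighted penalties, clamp to 0..100.
  const penalty = 50 * f.staleRatio + 30 * f.duplicateRatio + 20 * f.untypedRatio;
  return Math.max(0, Math.round(100 - penalty));
}
```

Under these example weights, a memory set that is 20% stale and 50% untyped would score 80, with staleness and typing surfaced as the actionable factors.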
Quick Start
```
npx cogmemai-mcp setup
```

That's it. The setup wizard verifies your API key, configures Claude Code, and installs automatic context recovery. Start Claude Code by typing `claude` and your memories are ready.
Don't have an API key yet? Get one free at hifriendbot.com/developer.
The Problem
Every time you start a new session, you lose context. You re-explain your tech stack, your architecture decisions, your coding preferences. Built-in memory in tools like Claude Code is a flat file with no search, no structure, and no intelligence.
CogmemAi gives your AI assistant a real memory system:
Semantic search — finds relevant memories by meaning, not keywords
AI-powered extraction — automatically identifies facts worth remembering from your conversations
Smart deduplication — detects duplicate and conflicting memories automatically
Privacy controls — auto-detects API keys, tokens, and secrets before storing
Document ingestion — feed in READMEs and docs to instantly build project context
Project scoping — memories tied to specific repos, plus global preferences that follow you everywhere
Smart context — intelligently ranked for maximum relevance to your current work
Compaction recovery — survives Claude Code context compaction automatically
Token-efficient — compact context loading that won't bloat your conversation
Zero setup — no databases, no Docker, no Python, no vector stores
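To make the "by meaning, not keywords" idea concrete, here is a toy sketch of semantic recall: rank memories by cosine similarity between a query embedding and each memory's embedding. The 3-dimensional vectors are invented for illustration; CogmemAi computes real embeddings server-side.

```typescript
type Memory = { text: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Return the topK memories closest in meaning to the query embedding.
function recall(query: number[], memories: Memory[], topK = 2): Memory[] {
  return [...memories]
    .sort((m1, m2) => cosine(query, m2.embedding) - cosine(query, m1.embedding))
    .slice(0, topK);
}

// Toy corpus: embeddings are made-up 3-d vectors, not real model output.
const corpus: Memory[] = [
  { text: "We use PostgreSQL 15 with JSONB columns", embedding: [0.9, 0.1, 0.0] },
  { text: "Prefer tabs over spaces in this repo", embedding: [0.0, 0.2, 0.9] },
  { text: "Auth is JWT with 15-minute expiry", embedding: [0.5, 0.8, 0.1] },
];
```

A query about the database would embed near the first vector (say `[1.0, 0.2, 0.0]`), so `recall` surfaces the PostgreSQL memory first even though the query shares no keywords with it.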
Why Cloud Memory?
Local memory solutions come with maintenance overhead: database management, version conflicts, storage growth, and setup complexity. CogmemAi runs extraction and search server-side. Your MCP server is a thin HTTP client — zero local databases, zero RAM issues, zero maintenance. All memories are encrypted at rest, so your data is just as secure as a local database — with cloud portability and team features on top.
Your memory follows you everywhere. Memories created in Claude Code are instantly available in Cursor, Windsurf, Cline, and any MCP-compatible tool. Switch between Opus, Sonnet, Haiku, or any model your editor supports — your memories persist regardless. New laptop? New OS? Log in and your full project knowledge is waiting. A local SQLite file dies with your machine. Cloud memory is permanent.
Teams and collaboration. Cloud memory is the only way to share project knowledge across teammates. When one developer saves an architecture decision or documents a bug fix, every team member's Ai assistant knows about it instantly. No syncing, no merge conflicts, no stale local databases. Whether it's two developers or twenty, everyone's assistant has the same up-to-date context. This is impossible with local-only memory solutions.
Compaction Recovery
When your AI assistant compacts the conversation, older history is compressed and context is lost. CogmemAi handles this automatically — your context is preserved before compaction and seamlessly restored afterward. No re-explaining, no manual prompting.
The npx cogmemai-mcp setup command configures everything automatically.
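Conceptually, the recovery flow looks like the sketch below: persist a context summary before compaction, then re-inject it afterward. The hook names and in-memory store are illustrative assumptions; the real implementation persists server-side and is wired up by the setup command.

```typescript
// Simplified model of compaction recovery. The store here is an
// in-memory Map; CogmemAi persists context server-side instead.
const store = new Map<string, string>();

function beforeCompaction(sessionId: string, contextSummary: string): void {
  // Persist a summary of the conversation before it is compressed away.
  store.set(sessionId, contextSummary);
}

function afterCompaction(sessionId: string): string {
  // Re-inject the saved summary so the assistant keeps its context.
  return store.get(sessionId) ?? "";
}
```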
Skill
CogmemAi includes a Claude Skill that teaches Claude best practices for memory management — when to save, importance scoring, memory types, and session workflows.
Claude Code:
```
/skill install https://github.com/hifriendbot/cogmemai-mcp/tree/main/skill/cogmemai-memory
```

Claude.ai: Upload the skill/cogmemai-memory folder in Settings > Skills.
Claude API: Use the Skills API to attach the skill to your requests.
CLI Commands
```
npx cogmemai-mcp setup          # Interactive setup wizard
npx cogmemai-mcp setup <key>    # Setup with API key
npx cogmemai-mcp verify         # Test connection and show usage
npx cogmemai-mcp --version      # Show installed version
npx cogmemai-mcp help           # Show all commands
```

Manual Setup
If you prefer to configure manually instead of using npx cogmemai-mcp setup:
Option A — Per project (add .mcp.json to your project root):
```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "cogmemai-mcp",
      "env": {
        "COGMEMAI_API_KEY": "cm_your_api_key_here"
      }
    }
  }
}
```

Option B — Global (available in every project):

```
claude mcp add cogmemai cogmemai-mcp -e COGMEMAI_API_KEY=cm_your_api_key_here --scope user
```

Works With
Claude Code (Recommended)
Automatic setup:
```
npx cogmemai-mcp setup
```

Cursor
Add to ~/.cursor/mcp.json:
```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "npx",
      "args": ["-y", "cogmemai-mcp"],
      "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
    }
  }
}
```

Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "npx",
      "args": ["-y", "cogmemai-mcp"],
      "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
    }
  }
}
```

Cline (VS Code)
Open VS Code Settings > Cline > MCP Servers, add:
```json
{
  "cogmemai": {
    "command": "npx",
    "args": ["-y", "cogmemai-mcp"],
    "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
  }
}
```

Continue
Add to ~/.continue/config.yaml:
```yaml
mcpServers:
  - name: cogmemai
    command: npx
    args: ["-y", "cogmemai-mcp"]
    env:
      COGMEMAI_API_KEY: cm_your_api_key_here
```

CogmemUI Cockpit
CogmemUI Cockpit is a free multi-model AI workspace with built-in CogmemAi memory. Add your CogmemAi API key in Settings > API Keys and your memory is instantly available. CogmemUI also supports connecting any MCP-compatible tool server via Settings > MCP Servers — add endpoints, auto-discover tools, and use them in chat.
Get your free API key at hifriendbot.com/developer.
Tools
CogmemAi provides 29 tools that your AI assistant uses automatically:
Store a fact explicitly (architecture decision, preference, etc.)
Search memories using natural language (semantic search)
AI extracts facts from a conversation exchange automatically
Load top memories at session start (with smart ranking, health score, and session replay)
Browse memories with filters (paginated, with untyped filter)
Update content, importance, scope, type, category, subject, and tags
Permanently delete a memory
Delete up to 100 memories at once
Update up to 50 memories at once (content, type, category, tags, etc.)
Check your usage stats and tier info
Export all memories as JSON for backup or transfer
Bulk import memories from a JSON array
Feed in a document (README, API docs) to auto-extract memories
Save a summary of what was accomplished in this session
View all tags in use across your memories
Connect related memories with named relationships
Explore the knowledge graph around a memory
View edit history of a memory
Memory health dashboard with self-tuning insights (filterable by project)
Promote a project memory to global scope
Merge related memories into comprehensive summaries using AI
Create a persistent task with status and priority tracking
Retrieve tasks for the current project — pick up where you left off
Change task status, priority, or description as you work
Store a "wrong approach → right approach" pattern to avoid repeated mistakes
Set a reminder that surfaces at the start of your next session
Find memories that may be outdated for review or cleanup
See what files changed since your last session
Signal whether a recalled memory was useful or irrelevant to improve future recall
SDKs
Build your own integrations with the CogmemAi API:
Memory Types
Memories are categorized for better organization and retrieval:
identity — Who you are, your role, team
preference — Coding style, tool choices, conventions
architecture — System design, tech stack, file structure
decision — Why you chose X over Y
bug — Known issues, fixes, workarounds
dependency — Version constraints, package notes
pattern — Reusable patterns, conventions
context — General project context
task — Persistent tasks with status and priority tracking
correction — Wrong approach → right approach patterns
reminder — Next-session nudges that auto-expire
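If you are building against the API, the types above can be modeled as a simple TypeScript union. This is a sketch; the field names beyond `type` are assumptions, not the actual SDK schema.

```typescript
// Sketch of a memory record using the documented type names.
// Field names other than `type` are illustrative assumptions.
type MemoryType =
  | "identity" | "preference" | "architecture" | "decision"
  | "bug" | "dependency" | "pattern" | "context"
  | "task" | "correction" | "reminder";

interface MemoryRecord {
  type: MemoryType;
  content: string;
  importance: number;            // e.g. 1 (low) .. 10 (high)
  scope: "project" | "global";
  tags: string[];
}

const example: MemoryRecord = {
  type: "decision",
  content: "Chose Postgres over MySQL for JSONB support",
  importance: 8,
  scope: "project",
  tags: ["database"],
};
```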
Scoping
Project memories — Architecture, decisions, bugs specific to one repo. Auto-detected from your repository.
Global memories — Your coding preferences, identity, tool choices. Available in every project.
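Conceptually, loading context for a session merges the two scopes: global preferences plus the current repo's memories. A minimal sketch, assuming a simple in-memory list:

```typescript
// Sketch of scope resolution: global memories apply everywhere,
// project memories only in their own repo.
type ScopedMemory = { content: string; scope: "global" | "project"; project?: string };

function loadContext(all: ScopedMemory[], currentProject: string): string[] {
  return all
    .filter(m => m.scope === "global" || m.project === currentProject)
    .map(m => m.content);
}

const demo: ScopedMemory[] = [
  { content: "Prefers TypeScript strict mode", scope: "global" },
  { content: "API layer uses gRPC", scope: "project", project: "repo-a" },
  { content: "Cache lives in Redis", scope: "project", project: "repo-b" },
];
```

Loading context inside `repo-a` would surface the global preference and the gRPC memory, but not `repo-b`'s Redis note.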
Pricing
| | Free | Pro | Team | Enterprise |
| --- | --- | --- | --- | --- |
| Price | $0 | $14.99/mo | $39.99/mo | $99.99/mo |
| Memories | 500 | 2,000 | 10,000 | 50,000 |
| Extractions/mo | 500 | 2,000 | 5,000 | 20,000 |
| Projects | 5 | 20 | 50 | 200 |
Start free. Upgrade when you need more. Or pay per operation with USDC on-chain — no credit card required.
Privacy & Security
Encryption at rest. All memories are encrypted before they touch the database. Even in a data breach, your data is unreadable.
No source code leaves your machine. We store extracted facts (short sentences), never raw code.
API keys are cryptographically hashed server-side (irreversible).
All traffic over HTTPS.
No model training on your data. Ever.
Delete everything instantly via dashboard or MCP tool.
No cross-user data sharing.
Read our full privacy policy.
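The auto-detection of keys and tokens before storage can be approximated with pattern matching. A rough sketch; these patterns are examples only, not CogmemAi's actual (server-side) detectors:

```typescript
// Rough sketch of pre-storage secret screening via regex patterns.
// The pattern list is illustrative, not exhaustive.
const SECRET_PATTERNS: RegExp[] = [
  /\bsk-[A-Za-z0-9]{20,}\b/,   // OpenAI-style secret keys
  /\bghp_[A-Za-z0-9]{36}\b/,   // GitHub personal access tokens
  /\bAKIA[0-9A-Z]{16}\b/,      // AWS access key IDs
];

function containsSecret(text: string): boolean {
  return SECRET_PATTERNS.some(p => p.test(text));
}
```

A memory whose content matches any pattern would be blocked or redacted before it is stored.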
Environment Variables
| Variable | Required | Description |
| --- | --- | --- |
| COGMEMAI_API_KEY | Yes | Your API key (starts with cm_) |
| | No | Custom API URL (default: hifriendbot.com) |
Support
Issues: GitHub Issues
License
MIT — see LICENSE
Built by HiFriendbot — Better Friends, Better Memories, Better AI.
Resources