# iai-mcp
Independent Autistic Intelligence — a local memory layer for Claude (and other MCP-compatible assistants).
## About the name
IAI stands for Independent Autistic Intelligence.
- **Independent.** Fully local. The daemon runs on your machine, embeddings are computed locally, no telemetry, no cloud dependency. Your memory is your data and stays your data.
- **Autistic.** Describes the memory style, not a diagnosis or a metaphor. The memory is built around verbatim recall, attention to specific cues, and refusal to smooth rare events into typical ones. Most memory systems compress and summarize aggressively, aiming to give the assistant a gist of the past. This one preserves what was actually said and surfaces it on a precise cue. The trade-off is intentional: more storage and a stricter retrieval interface, in exchange for not losing details.
- **Intelligence.** Used in the systems sense, something that observes, adapts, and stays viable over time, not the marketing sense.
## What it is
A local server that speaks the MCP protocol and gives Claude, and any other MCP-compatible assistant, a long-term memory. It captures every turn of every session verbatim, organizes those captures over time into a personal map of who you are, and serves a small slice of relevant memory back at the start of each new conversation. You never have to say "remember this" or "what did we say last time?".
I built this for myself. It worked. I've been running it daily for months, and now I'm sharing it. The benchmarks were mostly for my own curiosity. I wanted to know if it actually works or if I'd just gotten used to it.
## Usage
You do not call iai-mcp directly during a session. Once it's connected:
- **Capture is automatic.** Every turn, yours and the assistant's, is recorded verbatim with timestamps and session metadata. You don't say "remember this."
- **Recall is automatic.** When a new session starts, the daemon assembles a small relevant slice of your history and injects it into the conversation prefix. You don't say "what did we say."
- **Consolidation runs idle.** Between sessions, the daemon merges duplicates, strengthens recall pathways for things retrieved often, and prunes weak edges. The system gets quietly better at remembering you over time.
After a few weeks of regular use the difference becomes noticeable. The assistant stops asking the same orientation questions, references things you mentioned in passing, and adapts to your style without being told.
## How it works
The daemon is a Python process that runs in the background. Your MCP client connects to it via a Unix socket. No network exposure.
Memory is stored in three tiers:
- **Episodic.** Verbatim, timestamped fragments of what was said. Write-once, never overwritten or rewritten.
- **Semantic.** Summaries induced from clusters of related episodes during idle-time consolidation.
- **Procedural.** A small set of stable parameters about you, learned over time: preferences, style cues, recurring patterns. Eleven sealed knobs that shift based on what works.
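The three tiers can be pictured with a minimal sketch. These dataclasses and their field names are illustrative assumptions, not the actual iai-mcp schema:

```python
from dataclasses import dataclass, field

# Hypothetical shapes for the three memory tiers; names are illustrative,
# not the real iai-mcp storage schema.

@dataclass(frozen=True)              # episodic records are write-once
class EpisodicRecord:
    text: str                        # verbatim fragment, never rewritten
    timestamp: float                 # when it was said
    session_id: str                  # which session it came from

@dataclass
class SemanticSummary:
    summary: str                                          # induced from a cluster
    source_episode_ids: list[str] = field(default_factory=list)

@dataclass
class ProceduralProfile:
    knobs: dict[str, float] = field(default_factory=dict)  # eleven stable parameters
```

The `frozen=True` on the episodic tier mirrors the "write-once, never overwritten" guarantee at the type level.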
A background pass runs periodically (sleep cycles): it clusters episodes, builds semantic summaries, decays old unreinforced connections, and reinforces frequently co-retrieved paths. Things you haven't revisited fade naturally. There's an optional "insight of the day" step that makes one Anthropic API call, but it's off by default.
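The decay-and-reinforce step of a sleep cycle can be sketched roughly like this. The constants (`DECAY`, `REINFORCE`, `PRUNE_BELOW`) are invented for illustration, not the daemon's actual values:

```python
# Illustrative consolidation pass over graph edge weights; the constants
# below are assumptions, not iai-mcp's real tuning.
DECAY = 0.95          # unreinforced edges lose 5% per cycle
REINFORCE = 0.2       # boost for edges whose endpoints were co-retrieved
PRUNE_BELOW = 0.05    # edges weaker than this are dropped

def sleep_cycle(edges: dict[tuple[str, str], float],
                co_retrieved: set[tuple[str, str]]) -> dict[tuple[str, str], float]:
    out = {}
    for edge, w in edges.items():
        w *= DECAY                          # everything fades a little
        if edge in co_retrieved:
            w = min(1.0, w + REINFORCE)     # co-retrieved paths strengthen
        if w >= PRUNE_BELOW:                # weak edges are pruned entirely
            out[edge] = w
    return out
```

The net effect is the one described above: things you haven't revisited fade toward pruning, while frequently co-retrieved paths resist decay.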
Recall combines three signals: semantic similarity, graph-link strength, and recency. All ranked together.
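A minimal sketch of blending the three signals into one rank score, assuming a simple weighted sum with exponential recency decay. The weights and half-life are illustrative, not iai-mcp's tuning:

```python
import math

# Hedged sketch: weights and half-life are invented for illustration.
W_SIM, W_LINK, W_RECENCY = 0.6, 0.25, 0.15
HALF_LIFE_DAYS = 30.0

def recall_score(similarity: float, link_strength: float, age_days: float) -> float:
    # recency is 1.0 for a fresh memory and 0.5 at the half-life
    recency = math.exp(-math.log(2) * age_days / HALF_LIFE_DAYS)
    return W_SIM * similarity + W_LINK * link_strength + W_RECENCY * recency
```

Candidates from all three tiers would be scored this way and ranked together.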
All records are encrypted at rest with AES-256-GCM. The key lives in ~/.iai-mcp/.key (mode 0600). Back it up. Lose the key, lose the memories.
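A sketch of the key-file discipline described above, using only the standard library. The creation logic is illustrative, not the daemon's actual code:

```python
import os
import secrets
import stat

def ensure_key(path: str) -> bytes:
    """Create a 256-bit key with mode 0600 if missing; refuse lax permissions."""
    if not os.path.exists(path):
        # O_EXCL makes creation atomic; the file is born with mode 0600
        fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o600)
        try:
            os.write(fd, secrets.token_bytes(32))  # 32 bytes = AES-256 key
        finally:
            os.close(fd)
    mode = stat.S_IMODE(os.stat(path).st_mode)
    if mode != 0o600:
        raise PermissionError(f"key file {path} has mode {oct(mode)}, expected 0600")
    with open(path, "rb") as f:
        return f.read()
```

The same logic explains why backup matters: the key bytes exist only in that one file.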
Everything lives at ~/.iai-mcp/. Embeddings are computed locally with bge-small-en-v1.5. The only data that leaves the machine is your normal conversation with whatever LLM API your client uses.
```
Claude Code <--MCP-stdio--> TypeScript wrapper <--UNIX socket--> Python daemon <--> LanceDB
```

## Quick start
### Prerequisites
- macOS or Linux (Apple Silicon and x86_64 tested)
- Python 3.11 or 3.12
- Node.js 18+
- Claude Code as the MCP host
- ~500 MB free disk
Windows not supported. WSL2 untested.
### Install
```bash
git clone https://github.com/CodeAbra/iai-mcp.git
cd iai-mcp
bash scripts/install.sh
```

The installer creates a Python venv, installs dependencies (LanceDB, sentence-transformers, torch-hd, NetworkX, igraph), builds the TypeScript MCP wrapper, pre-downloads the default embedding model (~130 MB), symlinks the CLI to `~/.local/bin/iai-mcp`, and on macOS registers the daemon with launchd.
Make sure ~/.local/bin is on your PATH:
```bash
export PATH="$HOME/.local/bin:$PATH"  # add to ~/.zshrc or ~/.bashrc
iai-mcp --version
```

On Linux, install the systemd unit manually:
```bash
mkdir -p ~/.config/systemd/user
cp deploy/systemd/iai-mcp-daemon.service ~/.config/systemd/user/
systemctl --user daemon-reload
systemctl --user enable iai-mcp-daemon
systemctl --user start iai-mcp-daemon
```

### Install the Stop hook
This is what makes capture ambient. Without it you'd have to save memories by hand.
```bash
mkdir -p ~/.claude/hooks
cp deploy/hooks/iai-mcp-session-capture.sh ~/.claude/hooks/
chmod +x ~/.claude/hooks/iai-mcp-session-capture.sh
```

Register in `~/.claude/settings.json`:
```json
{
  "hooks": {
    "Stop": [
      {
        "matcher": "*",
        "hooks": [
          {
            "type": "command",
            "command": "$HOME/.claude/hooks/iai-mcp-session-capture.sh"
          }
        ]
      }
    ]
  }
}
```

### Connect Claude
```bash
claude mcp add iai-mcp -- node "$(pwd)/mcp-wrapper/dist/index.js"
```

Or edit `~/.claude.json` directly:
```json
{
  "mcpServers": {
    "iai-mcp": {
      "command": "node",
      "args": ["/absolute/path/to/iai-mcp/mcp-wrapper/dist/index.js"]
    }
  }
}
```

Use the absolute path. `~` and `$HOME` won't expand here.
For Claude Desktop (untested), edit ~/Library/Application Support/Claude/claude_desktop_config.json on macOS or the equivalent path on Linux.
### Verify
```bash
iai-mcp doctor
iai-mcp daemon status
```

Restart Claude Code. Start a session, do some work, exit. Then:

```bash
tail ~/.iai-mcp/logs/capture-$(date -u +%Y-%m-%d).log
```

You should see a `rc=0` line. That's your first memory.
## Doctor
iai-mcp doctor runs 14 checks against the daemon, the store, and the runtime state. Output is one line per check: PASS, WARN, or FAIL.
```bash
iai-mcp doctor
```

What it checks:
| # | Check | What it means |
|---|-------|---------------|
| a | Daemon alive | Is the daemon process running? |
| b | Socket fresh | Can the UNIX socket accept a connection? |
| c | Lock healthy | Is the process lock held correctly? |
| d | No orphan core | No leftover stdio core process without a daemon |
| e | State file valid | |
| f | LanceDB readable | Can the records table be opened and queried? |
| g | No duplicate binders | Only one process is bound to the socket |
| h | Crypto file state | Encryption key exists, correct permissions (0600) |
| i | Lance versions count | LanceDB version manifests aren't piling up |
| j | Lifecycle current state | Current FSM state is valid |
| k | Lifecycle history 24h | Recent lifecycle transitions look sane |
| l | Sleep cycle status | Last sleep cycle completed or is running normally |
| m | Heartbeat scanner | Wrapper heartbeat files are fresh |
| n | HID idle source | Idle detection source is available |
14/14 PASS is healthy. 13/14 with check (b) failing during a sleep cycle is also normal (the socket is busy during consolidation). Multiple FAILs or a FAIL on (a) or (f) means something is actually wrong.
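If you are scripting around doctor, a tolerant parser along these lines works, assuming only the one-line-per-check PASS/WARN/FAIL format described above (`summarize_doctor` and `looks_healthy` are hypothetical helpers, not part of the CLI):

```python
from collections import Counter

def summarize_doctor(output: str) -> Counter:
    """Count PASS/WARN/FAIL verdicts, one per line (format is an assumption)."""
    counts = Counter()
    for line in output.splitlines():
        for verdict in ("PASS", "WARN", "FAIL"):
            if verdict in line:
                counts[verdict] += 1
                break
    return counts

def looks_healthy(counts: Counter) -> bool:
    # 14/14 PASS, or a single FAIL (e.g. check (b) during a sleep cycle)
    return counts["FAIL"] == 0 or (counts["FAIL"] == 1 and counts["PASS"] >= 13)
```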
## Notes for AI assistants helping with installation
If you're an AI model helping a user install or debug iai-mcp, read this before diagnosing anything.
The daemon has four FSM states: WAKE, TRANSITIONING, SLEEP, DREAMING. Separately, the wrapper-layer lifecycle has four states: WAKE, DROWSY, SLEEP, HIBERNATION. These are two different state machines in two different files. Don't confuse them.
Things that look broken but are not:
- `last_tick_at` hasn't changed in hours. This is a sleep-cycle tick timestamp, not a process heartbeat. The daemon can be alive and healthy with a stale `last_tick_at`. Check `iai-mcp daemon status` instead.
- `fsm_state: DREAMING` for hours. Normal. The daemon is running a sleep cycle. Wake transitions are event-driven (a new session arriving wakes it), not time-driven.
- `status: deferred` in the capture log. This is the only path the Stop hook takes. It writes captures to `~/.iai-mcp/.deferred-captures/` and the daemon picks them up on its next tick. This is by design, not a failure.
- `iai-mcp doctor` shows 13/14 PASS with check (b) "socket unreachable." Expected when the daemon FSM is in SLEEP or DREAMING. The socket is intentionally unresponsive during consolidation passes. This is healthy.
- Cold start takes a few seconds. The daemon needs to initialize the LanceDB store and load the embedding model into memory. Don't kill it because it hasn't responded in 5 seconds.
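The deferred-capture handoff follows a common atomic write-then-sweep pattern, which this sketch illustrates. The file naming and JSON payload shape here are assumptions, not iai-mcp's actual format:

```python
import json
import os
import tempfile

def write_deferred(dir_: str, payload: dict) -> str:
    """Hook side: write a capture file atomically (illustrative format)."""
    os.makedirs(dir_, exist_ok=True)
    fd, tmp = tempfile.mkstemp(dir=dir_, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump(payload, f)
    final = tmp[:-4] + ".json"
    os.rename(tmp, final)        # atomic: a sweeper never sees a half-written file
    return final

def sweep_deferred(dir_: str) -> list[dict]:
    """Daemon side: on the next tick, consume every completed capture once."""
    picked = []
    for name in sorted(os.listdir(dir_)):
        if name.endswith(".json"):
            path = os.path.join(dir_, name)
            with open(path) as f:
                picked.append(json.load(f))
            os.remove(path)      # consumed exactly once
    return picked
```

The rename step is why `status: deferred` is safe: the hook never hands the daemon a partial file.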
Actual problems look like: `iai-mcp daemon status` returning `ok: False`, doctor showing multiple FAIL results, or the daemon process not existing at all (`pgrep -f iai_mcp.daemon` returns nothing).
When in doubt, run iai-mcp doctor and read what it says. The output is self-explanatory.
## Benchmarks
I made these because I wanted honest numbers. Every harness ships in bench/. Run them on your machine, get your own results.
| Metric | Target | Measured |
|--------|--------|----------|
| Verbatim recall (byte-exact) | >=99% | >=99% at N=10k |
| Recall p95 latency | <100 ms | <100 ms at N=10k |
| RAM at steady state | <=300 MB | ~150-300 MB |
| Session-start tokens (warm cache) | <=3,000 | <=3,000 |
| Session-start tokens (cold) | <=8,000 | <=8,000 |
```bash
python -m bench.verbatim                    # verbatim fidelity
python -m bench.neural_map                  # recall latency
python -m bench.memory_footprint            # RAM usage
python -m bench.tokens                      # session-start cost
python -m bench.total_session_cost          # full 10-turn cost
python -m bench.trajectory                  # 30-session corpus
python -m bench.contradiction_longitudinal  # falsifiability
python -m bench.longmemeval_blind           # LongMemEval-S blind run
```

The LongMemEval-S run is blind on purpose. No dataset-specific tuning, no hyperparameter sweep. The numbers are what they are.
## Configuration
| Variable | Default | What it does |
|----------|---------|--------------|
| | | Data directory |
| | | Embedding model |
Switching embedders requires re-embedding the store: `iai-mcp migrate reembed`.
## Status and limitations
This is experimental. I built it for myself, it works on my machine, and I'm sharing it because it might be useful to you. No SLA, no support guarantee. Breaking changes are possible between versions. Pin a commit hash if you depend on stability.
Limitations worth knowing about:
- The default embedding model is English-only. The assistant translates to English on the way into memory. The opt-in `bge-m3` model removes this constraint at a cost of ~3x storage and slower indexing.
- No cross-machine sync. The data lives where the daemon runs. Backup is `cp -a ~/.iai-mcp/` somewhere safe.
- No GUI. Inspection happens through CLI subcommands (`iai-mcp doctor`, `iai-mcp daemon status`, `iai-mcp topology`).
- Cold start on a freshly booted machine takes a few seconds while the daemon initializes caches.
- Recall quality on the first ~10 sessions is mediocre. The system needs material to consolidate before it gets useful.
## Compatibility
- Claude Code is the primary host, validated in daily use.
- Claude Desktop should work (it uses `claude_desktop_config.json` instead of `~/.claude.json`) but hasn't been tested end to end.
- Other MCP-over-stdio hosts speak the same protocol and should work in principle. Not tested.
If you get it running on something else, open an issue or PR.
## Authors
By Areg Aramovich Noya, in collaboration with the team at lcgc.dev.
I built this because I needed it. It works for me. If it works for you, take it.
## License
## Contributing
Issues and PRs welcome. If your change touches retrieval, capture, or consolidation, include bench re-runs.