@remember-md/mcp
Local MCP server for the Remember.md second brain. Run via npx, point any MCP client at it, query your markdown brain semantically.
Status: v0.0.1 — skeleton. Not yet functional. Active development.
What it does
Exposes your local markdown brain (a folder of .md files organised PARA-style by the Remember.md plugin) as a set of MCP tools any MCP client can call — Claude Code, OpenClaw, Cursor, Codex CLI, Claude.ai web, ChatGPT custom GPTs, anything that speaks the Model Context Protocol.
Planned tools:
- `search_brain(query, top_k)` — semantic + BM25 hybrid + wikilink-expand
- `get_file(path)` — read a brain file
- `list_recent(period, kind?)` — recent journal / notes / decisions
- `query_persona()` — current `Persona.md` content
- `dashboard_snapshot()` — counts + top beliefs + active projects
- `propose_belief(claim, evidence)` — write candidate to `Inbox/`
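For orientation, an MCP client invokes tools like these with a JSON-RPC `tools/call` request. A minimal sketch of what a client might send for `search_brain` once it lands — the query and `top_k` values are purely illustrative, since the tool is still planned:

```typescript
// Illustrative JSON-RPC envelope for calling the planned search_brain tool.
// The method name and params shape follow the Model Context Protocol;
// the argument values are invented for this example.
const searchRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_brain",
    arguments: {
      query: "ideas on AI ethics",
      top_k: 5, // number of results to return
    },
  },
};

console.log(JSON.stringify(searchRequest, null, 2));
```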
How it works
- Storage: `node:sqlite` (Node 22.5+ stdlib) + the sqlite-vec extension for vector search + FTS5 for BM25 — no server, no native compilation, no toolchain.
- Embeddings: `@huggingface/transformers` running quantized `Xenova/bge-micro-v2` (384d, ~17 MB) locally — no cloud calls.
- Sync: on-demand mtime + content-hash incremental reindex at query time. The brain (markdown) is the source of truth; the index in `.remember/index.db` is rebuildable.
- Graceful degradation: if vector search fails to load, falls back to FTS5-only; if both fail, falls back to ripgrep.
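The mtime + content-hash check can be sketched as follows. This is assumed logic, not the actual implementation: a file is re-embedded only when it has never been indexed, or when its mtime changed and its content hash no longer matches the index entry.

```typescript
import { createHash } from "node:crypto";

// Hypothetical index entry for one markdown file.
interface IndexEntry {
  mtimeMs: number;
  contentHash: string;
}

function contentHash(text: string): string {
  return createHash("sha256").update(text).digest("hex");
}

// Decide whether a file needs re-embedding. The cheap mtime check
// short-circuits the hash; a touched-but-identical file is skipped.
function needsReindex(
  entry: IndexEntry | undefined,
  mtimeMs: number,
  text: string,
): boolean {
  if (!entry) return true; // never indexed
  if (entry.mtimeMs === mtimeMs) return false; // mtime unchanged: skip hashing
  return entry.contentHash !== contentHash(text); // touched, but same bytes?
}
```

Because the markdown is the source of truth, this check can run lazily at query time rather than via a file watcher, and deleting the index only costs a rebuild.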
Install
You don't install it. Point your MCP client at it via npx:
Claude Code (via the Remember.md plugin's /remember:init)
The Remember.md plugin automatically configures Claude Code's MCP layer to launch this server. Just run /remember:init.
Cursor / Codex / other MCP clients
Add to your MCP config:
{
"mcpServers": {
"remember": {
"command": "npx",
"args": ["-y", "@remember-md/mcp"],
"env": {
"REMEMBER_BRAIN_PATH": "/absolute/path/to/your/brain"
}
}
}
}
First run downloads the package (~15–30 s) and the embedding model (~17 MB, one-time). After that, queries are sub-second.
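If you manage your MCP config programmatically, the entry above can be merged into an existing config object without clobbering other servers. A sketch under the common `mcpServers` convention — the config shape matches the JSON above, and the brain path is an example:

```typescript
// Common shape of an MCP client config file (e.g. Cursor's mcp.json).
type McpConfig = { mcpServers?: Record<string, unknown> };

// Return a new config with the remember server added, preserving
// any servers already present.
function addRememberServer(config: McpConfig, brainPath: string): McpConfig {
  return {
    ...config,
    mcpServers: {
      ...config.mcpServers,
      remember: {
        command: "npx",
        args: ["-y", "@remember-md/mcp"],
        env: { REMEMBER_BRAIN_PATH: brainPath },
      },
    },
  };
}
```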
Configuration
| Env var | Default | Purpose |
| --- | --- | --- |
| `REMEMBER_BRAIN_PATH` | | Brain root directory (folder of markdown files) |
| | `.remember/index.db` | Where the SQLite index lives |
| | `Xenova/bge-micro-v2` | Hugging Face model id |
| | `auto` | |
Privacy
Local-only. No cloud calls. No telemetry. The brain folder + index never leave your machine. Embedding model runs in-process via ONNX Runtime.
License
MIT — see LICENSE.
Related
Remember.md plugin — the capture / curate / persona side that produces the brain this server queries
Remember.md spec — the markdown standard