# Linksee Memory
## Server Configuration

Environment variables used to configure the server. Both are optional.
| Name | Required | Description | Default |
|---|---|---|---|
| LINKSEE_TELEMETRY | No | Opt-in telemetry setting: 'basic' to enable, 'off' to disable (default) | off |
| LINKSEE_MEMORY_DIR | No | Override the default database directory location | ~/.linksee-memory |
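As a minimal sketch of passing this configuration to the server, a client could build the child environment explicitly before spawning it. The `linksee-memory` launch command below is a hypothetical placeholder, not a documented binary name:

```python
import os
import subprocess

# Child environment with non-default values for illustration;
# omitting either variable falls back to the defaults above.
env = {
    **os.environ,
    "LINKSEE_TELEMETRY": "basic",           # opt in to telemetry
    "LINKSEE_MEMORY_DIR": "/data/linksee",  # override ~/.linksee-memory
}

# Hypothetical launch command -- substitute however you actually
# start the server (e.g. via your MCP client's config file).
# subprocess.Popen(["linksee-memory"], env=env)
```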
## Capabilities

Features and capabilities supported by this server.
| Capability | Details |
|---|---|
| tools | {} |
## Tools

Functions the server exposes to the LLM for taking actions.
| Name | Description |
|---|---|
| remember | Store a memory about an entity (person/company/project/concept/file) in one of 6 layers: goal (WHY), context (WHY-THIS-NOW), emotion (USER tone), implementation (HOW — success/failure), caveat (PAIN lesson, never forgotten), learning (GROWTH log). Use this when you discover non-obvious goals, unexpected failures, user preferences, or decisions worth preserving. Pasted assistant output or CI logs are rejected (use force=true only if you are sure). |
| recall | Retrieve memories relevant to the current context using full-text search (BM25) + entity-name match, re-ranked by a composite score (relevance × heat × momentum × importance). Returns only what fits in the token budget, with match_reasons explaining WHY each memory was returned. Opportunistically refreshes stale momentum scores for entities in the result set. Supports pagination via offset/has_more. Layer aliases accepted. Use at the start of any task that might involve prior work. |
| update_memory | Atomically edit an existing memory in-place. Preferred over forget+remember because it preserves memory_id, which matters for session_file_edits links and referential integrity. Use to correct facts, update deadlines in goal entries, refine caveats, or re-score importance. Caveat-layer memories can be updated but cannot have their protected flag removed. |
| list_entities | List the entities currently known to this memory store, sorted by recent activity. Use at the start of a new session ("what do I know about?") before issuing specific recall queries. Cheaper than recall for the "give me an overview" question. |
| forget | Explicitly delete a memory by id, OR run auto-forgetting across all memories based on forgettingRisk (importance + heat + age). Caveat-layer, goal-layer, and pinned (importance>=0.9) memories are always preserved. Prefer update_memory for corrections — forget is destructive. |
| consolidate | Sleep-mode compression. Clusters cold low-importance memories by (entity, layer), summarizes each cluster into a single protected learning-layer entry, deletes originals, and runs a forget-sweep. Run at session end or on demand. Set dry_run=true to preview without writing. |
| recall_file | Get the COMPLETE edit history of a file across all sessions, with per-edit user-intent context. Returns: total edit count, daily breakdown, list of distinct user intents that drove the edits, and the linked memories. Use this when you need to understand WHY a file was modified historically — far more accurate than recall() for file-centric questions because it queries session_file_edits (every physical edit) instead of summary memories. |
| read_smart | Read a file with diff-only caching. Returns: (1) full content + chunk metadata on first read, (2) "unchanged" + cached chunk list (~50 tokens) if mtime matches, (3) "unchanged_content" if mtime changed but sha256 matches (touched but not modified), (4) changed chunks with content + unchanged chunks as metadata-only if the file was truly modified. Use INSTEAD of Read for files you have read before — saves 50%+ tokens on re-reads. |
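The re-ranking and budget trimming that recall describes can be illustrated with a short sketch. This is not the server's implementation — it only mirrors the composite score stated above (relevance × heat × momentum × importance) and the token-budget cutoff, with made-up field names and numbers:

```python
def composite_score(m):
    # Composite re-ranking score described for recall:
    # relevance × heat × momentum × importance
    return m["relevance"] * m["heat"] * m["momentum"] * m["importance"]

# Illustrative candidates from full-text search + entity-name match.
candidates = [
    {"id": "a", "relevance": 0.9, "heat": 0.4, "momentum": 0.9, "importance": 0.8, "tokens": 180},
    {"id": "b", "relevance": 0.7, "heat": 0.9, "momentum": 0.6, "importance": 0.9, "tokens": 150},
    {"id": "c", "relevance": 0.5, "heat": 0.2, "momentum": 0.3, "importance": 0.4, "tokens": 90},
]

ranked = sorted(candidates, key=composite_score, reverse=True)

# Return only what fits in the token budget; oversized entries are skipped.
budget, used, results = 300, 0, []
for m in ranked:
    if used + m["tokens"] <= budget:
        results.append(m["id"])
        used += m["tokens"]
```

With these numbers, `b` ranks first despite lower relevance (high heat and importance), `a` is skipped because it would exceed the 300-token budget, and `c` fills the remainder.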
## Prompts

Interactive templates invoked by user choice.

This server exposes no prompts.
## Resources

Contextual data attached and managed by the client.

This server exposes no resources.
## MCP directory API

All information about MCP servers is available via the MCP directory API:

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/michielinksee/linksee-memory'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.