Import memories from JSON
`memory_import`

Import memories, decisions, and pitfalls from a JSON file produced by `memory_export`. Resolve conflicts with the `skip` (preserve) or `overwrite` (replace) strategy. Embeddings are processed asynchronously.
Instructions
Import memories, decisions, and pitfalls from a JSON file produced by `memory_export` (server-local path, not a URL). Conflict handling via `strategy`: `skip` (default, safe to re-run) keeps existing rows on id collision; `overwrite` replaces them — use for authoritative restores. Side effects: inserts/updates rows in `memories`, `decisions`, `pitfalls`. Embeddings are queued asynchronously.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| path | Yes | Absolute path on the server's filesystem to a JSON file produced by `memory_export` (e.g. `/tmp/memento-backup.json`). | |
| strategy | No | `skip` (default) preserves existing rows on id collision; `overwrite` replaces them. | skip |
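The interplay between an id collision and the `strategy` parameter can be sketched as a small pure function (a hypothetical helper for illustration; the real handler inlines this logic):

```typescript
type Strategy = "skip" | "overwrite";

// Decide what happens to an incoming row, given whether a row with the
// same id already exists and which strategy was requested.
function resolveConflict(exists: boolean, strategy: Strategy): "insert" | "update" | "skip" {
  if (!exists) return "insert"; // no collision: always insert
  return strategy === "overwrite" ? "update" : "skip"; // collision: strategy decides
}

console.log(resolveConflict(false, "skip"));     // "insert"
console.log(resolveConflict(true, "skip"));      // "skip"
console.log(resolveConflict(true, "overwrite")); // "update"
```

Because `skip` never touches existing rows, re-running an import with the default strategy is harmless.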
Output Schema
| Name | Required | Description | Default |
|---|---|---|---|
| message | Yes | Summary line with counts of rows inserted / updated / skipped per table (memories, decisions, pitfalls). | |
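The summary line is a single formatted string; its shape (taken from the handler's return statement in the source) can be reproduced with illustrative counts:

```typescript
// Sketch of the summary line the handler returns; the counts here are
// illustrative, not from a real import run.
const imported = 3;
const skipped = 1;
const overwritten = 0;
const strategy = "skip";

const message =
  `Import complete: ${imported} imported, ${skipped} skipped, ${overwritten} overwritten. ` +
  `Strategy=${strategy}.`;

console.log(message);
// Import complete: 3 imported, 1 skipped, 0 overwritten. Strategy=skip.
```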
Implementation Reference
- src/tools/memory-transfer.ts:74-169 (handler): The main handler `handleMemoryImport` reads a JSON file from the filesystem, validates the schema version, and imports memories into the database with a configurable skip/overwrite strategy.

```typescript
export async function handleMemoryImport(
  db: Database.Database,
  params: { path: string; strategy?: "skip" | "overwrite" },
): Promise<string> {
  const strategy = params.strategy ?? "skip";

  let raw: string;
  try {
    raw = readFileSync(params.path, "utf-8");
  } catch (e) {
    return `Failed to read import file: ${e instanceof Error ? e.message : String(e)}`;
  }

  let payload: ExportPayload;
  try {
    payload = JSON.parse(raw);
  } catch (e) {
    return `Invalid JSON: ${e instanceof Error ? e.message : String(e)}`;
  }

  if (!SUPPORTED_SCHEMA_VERSIONS.has(payload.schema_version)) {
    return `Unsupported schema_version ${payload.schema_version}. This build accepts: ${[...SUPPORTED_SCHEMA_VERSIONS].join(", ")}.`;
  }

  let imported = 0;
  let skipped = 0;
  let overwritten = 0;

  const tx = db.transaction(() => {
    const selectMem = db.prepare("SELECT id FROM memories WHERE id = ?");
    const insertMem = db.prepare(`
      INSERT INTO memories (id, project_id, memory_type, scope, title, body, tags,
        importance_score, confidence_score, access_count, last_accessed_at, is_pinned,
        supersedes_memory_id, source, adaptive_score, created_at, updated_at, deleted_at)
      VALUES (@id, @project_id, @memory_type, @scope, @title, @body, @tags,
        @importance_score, @confidence_score, @access_count, @last_accessed_at, @is_pinned,
        @supersedes_memory_id, @source, @adaptive_score, @created_at, @updated_at, @deleted_at)
    `);
    const updateMem = db.prepare(`
      UPDATE memories SET
        project_id = @project_id, memory_type = @memory_type, scope = @scope,
        title = @title, body = @body, tags = @tags,
        importance_score = @importance_score, confidence_score = @confidence_score,
        access_count = @access_count, last_accessed_at = @last_accessed_at,
        is_pinned = @is_pinned, supersedes_memory_id = @supersedes_memory_id,
        source = @source, adaptive_score = @adaptive_score,
        updated_at = @updated_at, deleted_at = @deleted_at
      WHERE id = @id
    `);

    for (const mem of payload.memories ?? []) {
      const row = {
        id: mem.id,
        project_id: mem.project_id ?? null,
        memory_type: mem.memory_type ?? "fact",
        scope: mem.scope ?? "project",
        title: mem.title,
        body: mem.body ?? "",
        tags: mem.tags ?? null,
        importance_score: mem.importance_score ?? 0.5,
        confidence_score: mem.confidence_score ?? 1.0,
        access_count: mem.access_count ?? 0,
        last_accessed_at: mem.last_accessed_at ?? null,
        is_pinned: mem.is_pinned ?? 0,
        supersedes_memory_id: mem.supersedes_memory_id ?? null,
        source: mem.source ?? "user",
        adaptive_score: mem.adaptive_score ?? 0.5,
        created_at: mem.created_at ?? new Date().toISOString(),
        updated_at: mem.updated_at ?? new Date().toISOString(),
        deleted_at: mem.deleted_at ?? null,
      };
      const existing = selectMem.get(row.id);
      if (existing) {
        if (strategy === "overwrite") {
          updateMem.run(row);
          overwritten++;
        } else {
          skipped++;
        }
      } else {
        insertMem.run(row);
        imported++;
      }
    }
  });
  tx();

  return (
    `Import complete: ${imported} imported, ${skipped} skipped, ${overwritten} overwritten. ` +
    `Strategy=${strategy}.`
  );
}
```

- src/tools/memory-transfer.ts:7-14 (schema): The `ExportPayload` interface defines the shape of the JSON file that `handleMemoryImport` expects (schema_version, exported_at, projects, memories, decisions, pitfalls).
```typescript
interface ExportPayload {
  schema_version: number;
  exported_at: string;
  projects: any[];
  memories: any[];
  decisions: any[];
  pitfalls: any[];
}
```

- src/index.ts:647-672 (registration): Registers the `memory_import` tool with the MCP server, including title, description, inputSchema (path and strategy), annotations, outputSchema, and the handler callback that calls `handleMemoryImport`.
```typescript
server.registerTool(
  "memory_import",
  {
    title: "Import memories from JSON",
    description: [
      "Import memories, decisions, and pitfalls from a JSON file produced by `memory_export` (server-local path, not a URL).",
      "Conflict handling via `strategy`: `skip` (default, safe to re-run) keeps existing rows on id collision; `overwrite` replaces them — use for authoritative restores.",
      "Side effects: inserts/updates rows in `memories`, `decisions`, `pitfalls`. Embeddings are queued asynchronously.",
    ].join(" "),
    inputSchema: {
      path: z
        .string()
        .min(1)
        .describe(
          "Absolute path on the server's filesystem to a JSON file produced by `memory_export` (e.g. `/tmp/memento-backup.json`).",
        ),
      strategy: z
        .enum(["skip", "overwrite"])
        .default("skip")
        .describe("`skip` (default) preserves existing rows on id collision; `overwrite` replaces them."),
    },
    annotations: {
      title: "Import memories from JSON",
      readOnlyHint: false,
      destructiveHint: false,
      idempotentHint: false,
      openWorldHint: false,
    },
    outputSchema: {
      message: z
        .string()
        .describe(
          "Summary line with counts of rows inserted / updated / skipped per table (memories, decisions, pitfalls).",
        ),
    },
  },
  async (params) => textResult(await handleMemoryImport(db, params)),
);
```

- src/tools/memory-transfer.ts:1-5 (helper): Imports used by `handleMemoryImport`: the `Database` type from better-sqlite3 and `readFileSync` from `node:fs` for reading the JSON file.
```typescript
import type Database from "better-sqlite3";
import { readFileSync } from "node:fs";

const SUPPORTED_SCHEMA_VERSIONS = new Set([2]);
const CURRENT_SCHEMA_VERSION = 2;
```
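For reference, a minimal payload this build would accept (schema_version 2) might look like the following sketch; the field values are illustrative, not taken from a real export:

```typescript
// Hypothetical minimal payload matching the ExportPayload shape.
// Only `id` and `title` are supplied for the memory row; the importer fills
// the remaining columns with its defaults (memory_type "fact", scope
// "project", importance_score 0.5, body "", and so on).
const payload = {
  schema_version: 2,
  exported_at: new Date().toISOString(),
  projects: [],
  memories: [{ id: "mem-1", title: "Example memory" }],
  decisions: [],
  pitfalls: [],
};

// The handler rejects any version not in SUPPORTED_SCHEMA_VERSIONS up front.
const SUPPORTED_SCHEMA_VERSIONS = new Set([2]);
console.log(SUPPORTED_SCHEMA_VERSIONS.has(payload.schema_version)); // true
```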