
Vector Memory MCP Server

Semantic memory storage for AI assistants. Store decisions, patterns, and context that persist across sessions.

A local-first MCP server that provides vector-based memory storage. Uses local embeddings and LanceDB for fast, private semantic search.



Features

  • Local & Private - All embeddings generated locally, data stored in local LanceDB

  • Semantic Search - Vector similarity search with configurable scoring (see the sketch after this list)

  • Batch Operations - Store, update, delete, and retrieve multiple memories at once

  • Session Handoffs - Save and restore project context between sessions

  • MCP Native - Standard protocol, works with any MCP-compatible client

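For a sense of how the first two features fit together, the sketch below embeds two strings with a local model and scores their similarity. It assumes the @xenova/transformers package (Transformers.js, credited at the end of this README) and the default Xenova/all-MiniLM-L6-v2 model; it illustrates the idea only and is not the server's actual code.

import { pipeline } from "@xenova/transformers";

// Load the default embedding model; weights are downloaded once and cached locally.
const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

// Mean-pool and normalize each input into a single 384-dimensional vector.
const memory = await embed("We use Drizzle ORM for database access", { pooling: "mean", normalize: true });
const query = await embed("What did we decide about the database?", { pooling: "mean", normalize: true });

// Cosine similarity; with normalized vectors this reduces to a dot product.
const a = memory.data as Float32Array;
const b = query.data as Float32Array;
let score = 0;
for (let i = 0; i < a.length; i++) score += a[i] * b[i];
console.log(score.toFixed(3)); // closer to 1 means more semantically similar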

Quick Start

Prerequisites

  • Bun 1.0+

  • An MCP-compatible client (Claude Code, Claude Desktop, etc.)

Install

bun install -g @aeriondyseti/vector-memory-mcp

The first install downloads ML models (~90 MB), so it may take a minute.

Configure

Add to your MCP client config (e.g., ~/.claude/settings.json):

{
  "mcpServers": {
    "vector-memory": {
      "type": "stdio",
      "command": "bunx",
      "args": ["--bun", "@aeriondyseti/vector-memory-mcp"]
    }
  }
}

Use

Restart your MCP client. You now have access to:

Tool              Description
store_memories    Save memories (accepts array)
search_memories   Find relevant memories semantically
get_memories      Retrieve memories by ID (accepts array)
update_memories   Update existing memories
delete_memories   Remove memories (accepts array)
store_handoff     Save session context for later
get_handoff       Restore session context

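You can also exercise these tools outside a chat client, which is useful for smoke-testing an install. Below is a minimal sketch using the MCP TypeScript SDK's stdio client; the tool names come from the table above, but the argument shapes (memories, query) are assumptions, since this README does not document the tool schemas.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, the same way an MCP client would.
const transport = new StdioClientTransport({
  command: "bunx",
  args: ["--bun", "@aeriondyseti/vector-memory-mcp"],
});

const client = new Client({ name: "vector-memory-smoke-test", version: "0.0.1" });
await client.connect(transport);

// Store one memory, then search for it semantically (field names are assumed).
await client.callTool({
  name: "store_memories",
  arguments: { memories: [{ content: "We use Drizzle ORM for database access" }] },
});
const results = await client.callTool({
  name: "search_memories",
  arguments: { query: "database ORM decision" },
});
console.log(results.content);

await client.close();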

Usage

Store a memory:

You: "Remember that we use Drizzle ORM for database access"
Assistant: [calls store_memories]

Search memories:

You: "What did we decide about the database?"
Assistant: [calls search_memories with a relevant query]

Session handoffs:

You: "Save context for next session"
Assistant: [calls store_handoff with summary, completed items, next steps]
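
Under the hood a handoff is just another tool call. A hypothetical payload, mirroring the summary / completed items / next steps structure described above (the real parameter names may differ):

// Hypothetical store_handoff arguments; the actual schema is not documented here.
const handoff = {
  summary: "Wired Drizzle ORM into the API layer",
  completed: ["Added Drizzle config", "Migrated the users table"],
  next_steps: ["Migrate the sessions table", "Add integration tests"],
};
// Passed as the arguments of a store_handoff call, e.g. with the client sketch shown earlier:
// await client.callTool({ name: "store_handoff", arguments: handoff });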

Configuration

Environment variables:

Variable                  Default                       Description
VECTOR_MEMORY_DB_PATH     .vector-memory/memories.db    Database location
VECTOR_MEMORY_MODEL       Xenova/all-MiniLM-L6-v2       Embedding model
VECTOR_MEMORY_HTTP_PORT   3271                          HTTP server port

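Because these variables are read by the server process, they usually need to be set where the server is launched. When an MCP client spawns the server, that typically means the client config rather than your shell; the sketch below extends the earlier entry with an env map, which most stdio-based MCP client configs (including Claude Code and Claude Desktop) support. The paths shown are only examples.

{
  "mcpServers": {
    "vector-memory": {
      "type": "stdio",
      "command": "bunx",
      "args": ["--bun", "@aeriondyseti/vector-memory-mcp"],
      "env": {
        "VECTOR_MEMORY_DB_PATH": "/Users/you/notes/.vector-memory/memories.db",
        "VECTOR_MEMORY_MODEL": "Xenova/all-MiniLM-L6-v2"
      }
    }
  }
}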

Development

git clone https://github.com/AerionDyseti/vector-memory-mcp.git
cd vector-memory-mcp
bun install

bun run test        # Run all tests
bun run dev         # Watch mode
bun run typecheck   # Type checking

See CHANGELOG.md for release history and ROADMAP.md for planned features.


Contributing

Contributions welcome! See issues for areas we'd love help with.

License

MIT - see LICENSE


Built with MCP SDK, LanceDB, and Transformers.js
