
One Memory. All CLIs. Never Compacted. Exact Search.

Session history management for AI coding assistants. Never lose your conversations again.

Features

  • On-demand search - You control when to search; automatic injection is opt-in

  • Original preservation - Raw messages always kept; summaries are optional layers

  • Multi-CLI support - Claude Code, Codex, OpenCode, Gemini in one database

  • Powerful search - Full-text (FTS5) + semantic vectors + hybrid ranking

  • MCP integration - Search directly from your AI CLI

  • REST API - Integrate into any workflow

  • Local storage - All data stays on your machine

Quick Start

Full

Server with full-text and semantic search, MCP, and REST API:

brew install vimo-ai/tap/memex

# Verify server is running
curl http://localhost:10013/health
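If the server needs a moment to start, a small poll loop avoids racing the first search. A minimal sketch against the /health route shown above; the helper name check_memex is ours, not part of memex:

```shell
# Sketch: poll /health until the server answers, up to N attempts
# (default 30, one second apart). check_memex is a hypothetical
# helper name, not a memex command.
check_memex() {
  local tries=${1:-30}
  for _ in $(seq "$tries"); do
    if curl -sf --max-time 2 http://localhost:10013/health >/dev/null 2>&1; then
      echo "memex: up"
      return 0
    fi
    sleep 1
  done
  echo "memex: down"
  return 1
}
```

For example, `check_memex 10` polls for ten seconds before giving up, and its exit status can gate the MCP setup steps below.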

Lite

A zero-dependency CLI that reads local session data directly:

brew install vimo-ai/tap/memex-lite

memex search "anything you want"
memex list -n 10

Docker

macOS / Linux:

# Optional: the Codex, OpenCode, and Gemini mounts and the OLLAMA_HOST
# variable (local Ollama via Docker Desktop). Omit any you do not use;
# inline comments cannot follow a trailing backslash, so they are listed here.
docker run -d -p 10013:10013 \
  -v ~/.vimo:/data \
  -v ~/.claude/projects:/claude:ro \
  -v ~/.codex:/codex:ro \
  -v ~/.local/share/opencode:/opencode:ro \
  -v ~/.gemini/tmp:/gemini:ro \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  ghcr.io/vimo-ai/memex:latest

Windows (PowerShell):

# Optional: the Codex, OpenCode, and Gemini mounts and the OLLAMA_HOST
# variable (local Ollama via Docker Desktop). Omit any you do not use;
# inline comments cannot follow a trailing backtick, so they are listed here.
docker run -d -p 10013:10013 `
  -v "$env:USERPROFILE\.vimo:/data" `
  -v "$env:USERPROFILE\.claude\projects:/claude:ro" `
  -v "$env:USERPROFILE\.codex:/codex:ro" `
  -v "$env:LOCALAPPDATA\opencode:/opencode:ro" `
  -v "$env:USERPROFILE\.gemini\tmp:/gemini:ro" `
  -e OLLAMA_HOST=http://host.docker.internal:11434 `
  ghcr.io/vimo-ai/memex:latest
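Mounting a host directory that does not exist can fail on some Docker setups. A bash sketch that builds the volume list from whichever session directories are actually present; build_memex_args is a hypothetical helper, not part of memex:

```shell
# Sketch: collect -v flags only for session directories that exist,
# so the same command works whichever CLIs you use.
# build_memex_args is a hypothetical helper name.
build_memex_args() {
  ARGS=(-d -p 10013:10013 -v "$HOME/.vimo:/data")
  [ -d "$HOME/.claude/projects" ] && ARGS+=(-v "$HOME/.claude/projects:/claude:ro")
  [ -d "$HOME/.codex" ] && ARGS+=(-v "$HOME/.codex:/codex:ro")
  [ -d "$HOME/.local/share/opencode" ] && ARGS+=(-v "$HOME/.local/share/opencode:/opencode:ro")
  [ -d "$HOME/.gemini/tmp" ] && ARGS+=(-v "$HOME/.gemini/tmp:/gemini:ro")
  return 0
}

build_memex_args
echo docker run "${ARGS[@]}" ghcr.io/vimo-ai/memex:latest   # drop echo to run
```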

Binary downloads available at Releases.

Configure MCP

# Claude Code
claude mcp add memex -- npx -y mcp-remote http://localhost:10013/api/mcp

# Codex
codex mcp add memex -- npx -y mcp-remote http://localhost:10013/api/mcp

# Gemini
gemini mcp add --transport http memex http://localhost:10013/api/mcp

# OpenCode - edit ~/.config/opencode/opencode.json
# { "mcp": { "memex": { "type": "remote", "url": "http://localhost:10013/api/mcp" } } }
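For OpenCode, the JSON above can be written in one step. A sketch that assumes you have no existing opencode.json; back it up or merge by hand if you do:

```shell
# Sketch: create ~/.config/opencode/opencode.json with the memex entry.
# Overwrites any existing file -- merge by hand if you already have one.
mkdir -p ~/.config/opencode
cat > ~/.config/opencode/opencode.json <<'EOF'
{
  "mcp": {
    "memex": {
      "type": "remote",
      "url": "http://localhost:10013/api/mcp"
    }
  }
}
EOF
```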

Then search in your AI CLI:

use memex search "anything you want"

Hooks (Optional)

Auto-inject relevant memory context into Claude Code sessions. See Hook Documentation for setup.

Documentation

https://vimoai.dev/docs/memex

Community

Discord

Join our Discord server for discussions, support, and updates.


