
Server Configuration

Environment variables used to configure the server. None are required; all have sensible defaults or are auto-detected.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| OPENAI_API_KEY | No | OpenAI API key (auto-detected by Memorix if provided). | |
| ANTHROPIC_API_KEY | No | Anthropic API key (auto-detected by Memorix if provided). | |
| OPENROUTER_API_KEY | No | OpenRouter API key (auto-detected by Memorix if provided). | |
| MEMORIX_EMBEDDING | No | Embedding provider option. One of: 'api', 'fastembed', 'transformers', 'off'. | off |
| MEMORIX_EMBEDDING_MODEL | No | Model to use for embeddings. | text-embedding-3-small |
| MEMORIX_EMBEDDING_API_KEY | No | API key for the embedding provider when using 'api' mode. | |
| MEMORIX_EMBEDDING_BASE_URL | No | Base URL for the embedding API. | https://api.openai.com/v1 |
| MEMORIX_EMBEDDING_DIMENSIONS | No | Optional dimension count for embeddings. | |
| MEMORIX_LLM_PROVIDER | No | LLM provider. One of: openai, anthropic, openrouter, custom. | |
| MEMORIX_LLM_MODEL | No | Model name for LLM-enhanced features (e.g., gpt-4.1-nano). | |
| MEMORIX_LLM_API_KEY | No | API key for LLM-enhanced features. | |
| MEMORIX_LLM_BASE_URL | No | Custom endpoint for the LLM API provider. | |
| MEMORIX_PROJECT_ROOT | No | The path to the project root. Usually auto-detected from git remote. | |
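A minimal configuration sketch using the OpenAI API for both embeddings and LLM-enhanced features. All values are placeholders; since every variable is optional, set only what you need:

```shell
# Hypothetical setup: API-based embeddings plus LLM-enhanced features.
# Replace the key with your own; Memorix auto-detects OPENAI_API_KEY.
export OPENAI_API_KEY="sk-your-key-here"
export MEMORIX_EMBEDDING="api"                           # 'api', 'fastembed', 'transformers', or 'off'
export MEMORIX_EMBEDDING_MODEL="text-embedding-3-small"  # the default
export MEMORIX_LLM_PROVIDER="openai"                     # openai, anthropic, openrouter, or custom
export MEMORIX_LLM_MODEL="gpt-4.1-nano"
```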

Capabilities

Features and capabilities supported by this server

| Capability | Details |
| --- | --- |
| tools | `{ "listChanged": true }` |

Tools

Functions exposed to the LLM to take actions

memorix_store

Store a new observation/memory. Automatically indexed for search. Use type to classify: gotcha (🔴 critical pitfall), decision (🟤 architecture choice), problem-solution (🟡 bug fix), how-it-works (🔵 explanation), what-changed (🟢 change), discovery (🟣 insight), why-it-exists (🟠 rationale), trade-off (⚖️ compromise), session-request (🎯 original goal). Stored memories persist across sessions and are shared with other IDEs (Cursor, Windsurf, Claude Code, Codex, Copilot, Kiro, Antigravity, Trae) via the same local data directory.
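As a sketch, an MCP tools/call request for this tool might look like the following. The argument names `type` and `content` are assumptions inferred from the description, not taken from the server's published schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memorix_store",
    "arguments": {
      "type": "gotcha",
      "content": "SQLite writes fail on network drives; keep the data directory local."
    }
  }
}
```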

memorix_suggest_topic_key

Suggest a stable topic_key for memory upserts. Use this before memorix_store when you want evolving topics (like architecture decisions, config docs) to update a single observation over time instead of creating duplicates. Returns a key like "architecture/auth-model" or "bug/timeout-in-api-gateway".
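A hypothetical call sketch (the `content` argument name is an assumption); the returned key would then be passed as the topic_key of a subsequent memorix_store call so repeated updates land on one observation:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "memorix_suggest_topic_key",
    "arguments": {
      "content": "We decided to use JWT access tokens with a 15-minute TTL."
    }
  }
}
```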

memorix_search

Search project memory. Returns a compact index (~50-100 tokens/result). Use memorix_detail to fetch full content for specific IDs. Use memorix_timeline to see chronological context. Searches across all observations stored from any IDE session, enabling cross-session and cross-agent context retrieval.
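A search call might be shaped like this (the `query` argument name is an assumption for illustration):

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "memorix_search",
    "arguments": {
      "query": "auth token refresh gotchas"
    }
  }
}
```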

memorix_resolve

Mark observations as resolved (completed/no longer active). Resolved memories are hidden from default search but can still be found with status="all". Use this to mark completed tasks, fixed bugs, or outdated information so they don't pollute future context.

memorix_deduplicate

Scan active memories for duplicates, contradictions, and outdated information using LLM analysis. Automatically resolves redundant memories. Requires LLM to be configured (set MEMORIX_LLM_API_KEY or OPENAI_API_KEY environment variable). Without LLM, falls back to basic similarity-based consolidation.

memorix_timeline

Get chronological context around a specific observation. Shows what happened before and after the anchor observation.

memorix_detail

Fetch full observation details by IDs (~500-1000 tokens each). Always use memorix_search first to find relevant IDs, then fetch only what you need.

memorix_retention

Show memory retention status or archive expired memories. action="report" (default): show active/stale/archive-candidate counts. action="archive": move expired observations to archive file (reversible). Uses exponential decay scoring based on importance, age, and access patterns.

memorix_rules_sync

Scan project for agent rule files (Cursor, Claude Code, Codex, Windsurf, Antigravity, Copilot, Kiro, OpenCode, Trae), deduplicate, detect conflicts, and optionally generate rules for a target agent format. Without target: returns sync status report. With target: generates converted rule files.

memorix_workspace_sync

Migrate your entire workspace environment between AI coding agents (Cursor, Windsurf, Claude Code, Codex, Copilot, Kiro, Antigravity, OpenCode, Trae). Syncs MCP server configs, workflows, rules, and skills across IDEs. Action "scan": detect all workspace configs. Action "migrate": generate configs for target agent (preview only). Action "apply": migrate AND write configs to disk with backup/rollback.

memorix_skills

Memory-driven project skills. Action "list": show all available skills from all agents. Action "generate": auto-generate project-specific skills from observation patterns (gotchas, decisions, how-it-works). Action "inject": return a specific skill's full content for direct use. Generated skills follow the SKILL.md standard and can be synced across Cursor, Windsurf, Claude Code, Codex, Copilot, Kiro, Antigravity, OpenCode, and Trae.

memorix_promote

Promote observations to permanent mini-skills that never decay and are auto-injected at session start. Action "promote": convert observation(s) to a mini-skill. Action "list": show all active mini-skills. Action "delete": remove a mini-skill by ID.

Mini-skills are project-specific specialized knowledge derived from your actual memories: gotchas, decisions, and fixes that generic online skills cannot provide.

memorix_consolidate

Find and merge similar observations to reduce memory bloat. Uses text similarity to cluster related observations by entity+type, then merges them into single consolidated records. Use action="preview" to see candidates without changing data, action="execute" to merge.

Example: 10 similar gotchas about Windows paths → 1 consolidated gotcha with all facts preserved.

memorix_session_start

Start a new coding session. Returns context from previous sessions so you can resume work seamlessly. Call this at the beginning of a session to track activity and get injected context. Any previous active session for this project will be auto-closed.

memorix_session_end

End a coding session with a structured summary. This summary will be injected into the next session so the next agent can resume work seamlessly.

Recommended summary format:

Goal

[What we were working on]

Discoveries

  • [Technical findings, gotchas, learnings]

Accomplished

  • ✅ [Completed tasks]
  • 🔲 [Pending for next session]

Relevant Files

  • path/to/file: [what changed]
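Putting the format to use, a hypothetical session-end call might look like the following (the `summary` argument name is an assumption, and the summary text is invented for illustration):

```json
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "tools/call",
  "params": {
    "name": "memorix_session_end",
    "arguments": {
      "summary": "Goal\n\nStabilize flaky auth tests\n\nAccomplished\n\n  • Fixed token refresh mock\n  • Pending: migrate remaining fixtures"
    }
  }
}
```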

memorix_session_context

Get context from previous coding sessions. Use this after compaction to recover lost context, or to manually review session history. Returns previous session summaries and key observations.

memorix_transfer

Export or import project memories. Action "export": export observations and sessions (JSON or Markdown). Action "import": import from a JSON export (re-assigns IDs, skips duplicate topicKeys).
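An export call might be shaped like this (the `action` value comes from the description above; the `format` argument name is an assumption):

```json
{
  "jsonrpc": "2.0",
  "id": 5,
  "method": "tools/call",
  "params": {
    "name": "memorix_transfer",
    "arguments": {
      "action": "export",
      "format": "json"
    }
  }
}
```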

memorix_dashboard

Launch the Memorix Web Dashboard in the browser. Shows knowledge graph, observations, retention scores, and project stats in a visual interface.

team_manage

Register, unregister, or list agents in the team. Action "join": register this agent (returns agent ID). Action "leave": mark agent inactive, release locks. Action "status": list all agents with roles and capabilities.

team_file_lock

Advisory file locks to prevent conflicting edits. Auto-releases after 10 min TTL. Action "lock": acquire lock. Action "unlock": release lock. Action "status": check lock status.
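A lock acquisition might be requested like this (the `action` value comes from the description; the `path` argument name is an assumption). Note the lock is advisory and auto-releases after the 10-minute TTL:

```json
{
  "jsonrpc": "2.0",
  "id": 6,
  "method": "tools/call",
  "params": {
    "name": "team_file_lock",
    "arguments": {
      "action": "lock",
      "path": "src/auth/session.ts"
    }
  }
}
```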

team_task

Create, claim, complete, or list tasks in the team task board. Supports dependencies. Action "create": create a task. Action "claim": assign to yourself. Action "complete": mark done with result. Action "list": show tasks.

team_message

Send, broadcast, or read messages between agents. Action "send": direct message to one agent. Action "broadcast": message all active agents. Action "inbox": read this agent's inbox.

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources


MCP directory API

All information about MCP servers is available via our MCP directory API:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/AVIDS2/memorix'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.