An MCP server that retrieves comprehensive context from all memory systems via semantic search, enhancing an AI assistant's ability to retain short-term, long-term, and episodic memory.
An MCP server implementing Recursive Language Models (RLM) to process arbitrarily large contexts through a programmatic probe, recurse, and synthesize loop. It enables LLMs to perform multi-step investigations and evidence-backed extraction across massive file sets without being limited by standard context windows.
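The probe, recurse, and synthesize loop described above could be sketched roughly as follows. This is an illustrative outline only, not the server's actual API: the `probe` function stands in for an LLM call, and the chunking strategy is an assumption.

```python
def probe(chunk: str, query: str) -> str:
    """Stand-in for an LLM call that extracts evidence from one chunk.

    Hypothetical: a real implementation would prompt a model here.
    """
    return chunk if query.lower() in chunk.lower() else ""


def rlm(text: str, query: str, chunk_size: int = 1000) -> str:
    # Base case: the context fits in one "window", so probe it directly.
    if len(text) <= chunk_size:
        return probe(text, query)
    # Recurse: split the oversized context, probe each piece,
    # then synthesize the partial findings into a smaller context.
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    findings = [rlm(chunk, query) for chunk in chunks]
    return "\n".join(f for f in findings if f)
```

Because each recursive call operates on a chunk no larger than the window, the loop can, in principle, investigate a corpus of any size while every individual model call stays within the context limit.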
An MCP server that lets users run and visualize systems models with the lethain:systems library, including capabilities to execute model specifications and load systems documentation into the context window.
A persistent SQLite-backed storage server that enables Cursor IDE's AI assistant to remember information across sessions with global or project-specific scopes. It supports complete CRUD operations, tag-based organization, and keyword search to manage user preferences and context effectively.