Why this server?
Provides an improved implementation of persistent memory using a local knowledge graph with a customizable --memory-path, letting Claude remember information across chats.
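Knowledge-graph memory servers like this one typically persist a simple entity/relation model to a local file whose location can be overridden (for example via --memory-path). The TypeScript sketch below illustrates that idea under stated assumptions: the field names, the JSONL layout, and the remember() helper are invented for illustration and are not any specific server's schema.

```typescript
import { appendFile } from "node:fs/promises";

// Hypothetical shapes for a local knowledge-graph memory file.
// Field names are assumptions for illustration, not a specific server's schema.
interface Entity {
  name: string;           // unique identifier, e.g. "alice"
  entityType: string;     // e.g. "person", "project"
  observations: string[]; // facts remembered across chats
}

interface Relation {
  from: string;         // source entity name
  to: string;           // target entity name
  relationType: string; // e.g. "works_on"
}

// Append one record per line (JSONL) to a configurable memory path.
async function remember(memoryPath: string, record: Entity | Relation): Promise<void> {
  await appendFile(memoryPath, JSON.stringify(record) + "\n", "utf8");
}

// Example: the path would typically come from a flag such as --memory-path.
await remember("./memory.jsonl", {
  name: "alice",
  entityType: "person",
  observations: ["prefers TypeScript"],
});
```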
Why this server?
Offers a basic implementation of persistent memory using a local knowledge graph, allowing Claude to remember user information across chats.
Why this server?
Provides persistent memory integration for chat applications, using a local knowledge graph to remember user information across interactions.
Why this server?
Enhances user interaction with a persistent memory system that remembers information across chats and learns from past errors, using a local knowledge graph and lesson management.
Why this server?
Offers a high-performance, persistent memory system for the Model Context Protocol (MCP), providing vector search capabilities and efficient knowledge storage with libSQL as the backing store.
Why this server?
A high-performance MCP server utilizing libSQL for persistent memory and vector search capabilities, enabling efficient entity management and semantic knowledge storage.
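For these libSQL-backed servers, the core idea is to store embeddings alongside stored knowledge and retrieve by vector distance. Below is a minimal sketch using the @libsql/client package and libSQL's native vector search (F32_BLOB columns, vector32(), vector_distance_cos()); the table and column names are hypothetical and do not reflect either server's actual schema.

```typescript
import { createClient } from "@libsql/client";

// Hypothetical schema: memory text plus a 4-dimensional embedding for demo purposes.
const db = createClient({ url: "file:memory.db" });

await db.execute(
  "CREATE TABLE IF NOT EXISTS memories (id INTEGER PRIMARY KEY, content TEXT, embedding F32_BLOB(4))"
);

await db.execute({
  sql: "INSERT INTO memories (content, embedding) VALUES (?, vector32(?))",
  args: ["user prefers dark mode", "[0.1, 0.2, 0.3, 0.4]"],
});

// Nearest-neighbour lookup by cosine distance; smaller means more similar.
const result = await db.execute({
  sql: `SELECT content, vector_distance_cos(embedding, vector32(?)) AS distance
        FROM memories ORDER BY distance LIMIT 3`,
  args: ["[0.1, 0.2, 0.3, 0.4]"],
});

console.log(result.rows);
```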
Why this server?
A server that manages conversation context for LLM interactions, storing recent prompts and providing relevant context for each user via REST API endpoints.
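The description above implies a simple per-user REST API: store recent prompts, then fetch the context the server deems relevant. The TypeScript sketch below shows how a client might call such an API; the base URL, endpoint paths, and payload shapes are invented for illustration and will differ from the server's real routes.

```typescript
// Hypothetical endpoints; the real server's routes and payloads may differ.
const BASE_URL = "http://localhost:3000";

// Store a user's latest prompt so it can inform later context lookups.
async function storePrompt(userId: string, prompt: string): Promise<void> {
  await fetch(`${BASE_URL}/users/${userId}/prompts`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
}

// Fetch the context the server considers relevant for this user's next turn.
async function getContext(userId: string): Promise<string[]> {
  const res = await fetch(`${BASE_URL}/users/${userId}/context`);
  const body = (await res.json()) as { context: string[] };
  return body.context;
}

await storePrompt("user-42", "Summarise yesterday's design discussion.");
console.log(await getContext("user-42"));
```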
Why this server?
Memory Bank Server provides a set of tools and resources for AI assistants to interact with Memory Banks. Memory Banks are structured repositories of information that help maintain context and track progress across multiple sessions.
Why this server?
This advanced memory server facilitates neural memory-based sequence learning and prediction, enhancing code generation and understanding through state maintenance and manifold optimization, inspired by Google Research's framework.
Why this server?
Enables neural memory sequence learning with a memory-augmented model for improved code understanding and generation, featuring state management, novelty detection, and model persistence.