Why this server?
This server provides persistent memory capabilities for Claude, offering a tiered memory architecture with semantic search, memory consolidation, and integration with the Claude desktop application.
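To illustrate the tiered-memory idea described above (not this server's actual implementation), the sketch below keeps recent items in a short-term buffer and consolidates them into a long-term store; keyword overlap stands in for the embedding-based semantic search a real server would use, and all names are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class Memory:
    text: str
    created: datetime = field(default_factory=datetime.utcnow)


class TieredMemoryStore:
    """Two-tier sketch: a short-term buffer consolidated into long-term storage."""

    def __init__(self, short_term_limit: int = 5):
        self.short_term: list[Memory] = []
        self.long_term: list[Memory] = []
        self.short_term_limit = short_term_limit

    def remember(self, text: str) -> None:
        self.short_term.append(Memory(text))
        if len(self.short_term) > self.short_term_limit:
            self.consolidate()

    def consolidate(self) -> None:
        # Move everything from the short-term buffer into long-term storage.
        # A real implementation might summarize or deduplicate here.
        self.long_term.extend(self.short_term)
        self.short_term.clear()

    def recall(self, query: str, k: int = 3) -> list[str]:
        # Keyword-overlap ranking as a stand-in for semantic similarity.
        terms = set(query.lower().split())
        scored = [
            (len(terms & set(m.text.lower().split())), m.text)
            for m in self.short_term + self.long_term
        ]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [text for score, text in scored[:k] if score > 0]


if __name__ == "__main__":
    store = TieredMemoryStore()
    store.remember("User prefers concise answers")
    store.remember("User is working on a Terraform project")
    print(store.recall("terraform"))
```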
Why this server?
A comprehensive memory management system for Cursor IDE that allows AI assistants to remember, recall, and manage information across conversations through a user-friendly interface.
Why this server?
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
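A minimal sketch of the local-knowledge-graph pattern this entry describes: entities with free-text observations plus relations between them, persisted to a JSON file. The file name and field layout are assumptions for illustration, not this server's format.

```python
import json
from pathlib import Path

GRAPH_PATH = Path("memory_graph.json")  # hypothetical local storage file


def load_graph() -> dict:
    if GRAPH_PATH.exists():
        return json.loads(GRAPH_PATH.read_text())
    return {"entities": {}, "relations": []}


def save_graph(graph: dict) -> None:
    GRAPH_PATH.write_text(json.dumps(graph, indent=2))


def add_observation(graph: dict, entity: str, observation: str) -> None:
    # Each entity keeps a list of free-text observations remembered about it.
    graph["entities"].setdefault(entity, []).append(observation)


def add_relation(graph: dict, source: str, relation: str, target: str) -> None:
    graph["relations"].append({"from": source, "relation": relation, "to": target})


if __name__ == "__main__":
    g = load_graph()
    add_observation(g, "user", "prefers Python examples")
    add_relation(g, "user", "works_on", "billing-service")
    save_graph(g)
    print(g["entities"]["user"])
```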
Why this server?
A system that manages context for language model interactions, allowing the model to remember previous interactions across multiple independent sessions using the Gemini API.
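A rough sketch of that cross-session pattern, assuming the `google-generativeai` Python package: chat history is saved to disk after each session and restored into the next one. The `GEMINI_API_KEY` variable, model name, and history file are assumptions, not details of this server.

```python
import json
import os
from pathlib import Path

import google.generativeai as genai  # pip install google-generativeai

HISTORY_PATH = Path("chat_history.json")  # hypothetical per-user history file

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")

# Restore previous turns so the model "remembers" earlier sessions.
history = json.loads(HISTORY_PATH.read_text()) if HISTORY_PATH.exists() else []
chat = model.start_chat(history=history)

reply = chat.send_message("What were we working on last time?")
print(reply.text)

# Persist the updated history for the next independent session.
HISTORY_PATH.write_text(json.dumps(
    [{"role": turn.role, "parts": [p.text for p in turn.parts]} for turn in chat.history],
    indent=2,
))
```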
Why this server?
Manages AI conversation context and personal knowledge bases through the Model Context Protocol (MCP), providing tools for user data, conversation content, and knowledge management.
Why this server?
A server that manages conversation context for LLM interactions, storing recent prompts and providing relevant context for each user via REST API endpoints.
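The sketch below shows the general shape of such an API using Flask: one endpoint stores a user's recent prompts, another returns them as context for the next LLM call. The routes, payloads, and in-memory storage are illustrative assumptions, not this server's actual endpoints.

```python
from collections import defaultdict, deque

from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)

# Keep only the most recent prompts per user (in-memory; a real server
# would back this with a database).
recent_prompts: dict[str, deque] = defaultdict(lambda: deque(maxlen=20))


@app.post("/users/<user_id>/prompts")
def store_prompt(user_id: str):
    prompt = request.get_json()["prompt"]
    recent_prompts[user_id].append(prompt)
    return jsonify({"stored": True}), 201


@app.get("/users/<user_id>/context")
def get_context(user_id: str):
    # Return recent prompts as the context to prepend to the next LLM call.
    return jsonify({"context": list(recent_prompts[user_id])})


if __name__ == "__main__":
    app.run(port=8000)
```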
Why this server?
A Model Context Protocol server providing vector database capabilities through Chroma, enabling semantic document search, metadata filtering, and document management with persistent storage.
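For a sense of what persistent storage, semantic search, and metadata filtering look like with Chroma itself, here is a short sketch using the `chromadb` Python client; collection names, documents, and filters are made up for illustration.

```python
import chromadb  # pip install chromadb

# Persistent local storage so documents survive restarts.
client = chromadb.PersistentClient(path="./chroma_data")
collection = client.get_or_create_collection(name="notes")

collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Terraform state should be stored in a remote backend.",
        "Use workspaces to separate staging and production.",
    ],
    metadatas=[{"topic": "terraform"}, {"topic": "terraform"}],
)

# Semantic search combined with a metadata filter.
results = collection.query(
    query_texts=["where should I keep my state file?"],
    n_results=1,
    where={"topic": "terraform"},
)
print(results["documents"][0])
```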
Why this server?
A custom Memory MCP Server that acts as a cache for Infrastructure-as-Code information, allowing users to store, summarize, and manage notes with a custom URI scheme and simple resource handling.
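A minimal sketch of the custom-URI idea, under the assumption of a hypothetical `note://` scheme: notes are stored and summarized by URI, with simple truncation standing in for an LLM-generated summary. None of these names come from the server itself.

```python
from urllib.parse import urlparse

# In-memory note store keyed by a custom URI, e.g. "note://terraform/vpc-module".
notes: dict[str, str] = {}


def put_note(uri: str, body: str) -> None:
    parsed = urlparse(uri)
    if parsed.scheme != "note":
        raise ValueError(f"unsupported scheme: {parsed.scheme}")
    notes[uri] = body


def get_note(uri: str) -> str:
    return notes[uri]


def summarize_note(uri: str, max_chars: int = 80) -> str:
    # Naive truncation as a stand-in for a real summarization step.
    body = notes[uri]
    return body if len(body) <= max_chars else body[:max_chars] + "..."


if __name__ == "__main__":
    put_note(
        "note://terraform/vpc-module",
        "The VPC module pins provider aws >= 5.0 and exposes three subnets.",
    )
    print(summarize_note("note://terraform/vpc-module", max_chars=40))
```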
Why this server?
SourceSage is an MCP (Model Context Protocol) server that efficiently memorizes key aspects of a codebase—logic, style, and standards—while allowing dynamic updates and fast retrieval. It's designed to be language-agnostic, leveraging the LLM's understanding of code across multiple languages.
Why this server?
The Model Context Protocol (MCP) Server enables integration between MCP clients and the Graphlit service. Ingest anything from Slack to Gmail to podcast feeds, in addition to web crawling, into a Graphlit project, and then retrieve relevant content from the MCP client.