Why this server?
- This server provides memory management for Cursor IDE, allowing AI assistants to remember and recall information across conversations through a user-friendly interface.
- This server manages AI conversation context and personal knowledge bases through the Model Context Protocol (MCP), providing tools for user data, conversation content, and knowledge management.
- An MCP server that provides persistent memory capabilities for Claude, offering a tiered memory architecture with semantic search, memory consolidation, and integration with the Claude desktop application.
- A server that manages conversation context for LLM interactions, storing recent prompts and providing relevant context for each user via REST API endpoints.
- This system manages context for language model interactions, allowing the model to remember previous interactions across multiple independent sessions using the Gemini API.
- A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
- Enables communication and coordination between different LLM agents across multiple systems, allowing specialized agents to collaborate on tasks, share context, and coordinate work through a unified platform.
- A custom Memory MCP Server that acts as a cache for Infrastructure-as-Code information, allowing users to store, summarize, and manage notes with a custom URI scheme and simple resource handling.
- A Model Context Protocol server that enables creation and management of multiple Fireproof JSON databases with CRUD operations, querying capabilities, and cloud synchronization for sharing databases with others.
- Flipt's MCP server allows AI assistants and LLMs to directly interact with your feature flags, segments, and evaluations through a standardized interface.