Why this server?
Provides knowledge graph functionality for managing entities, relations, and observations in memory, which can serve as project memory.
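For orientation, here is a minimal sketch of an entity/relation/observation model of the kind such a knowledge-graph memory server works with; the field and function names are illustrative assumptions, not the server's actual schema.

```typescript
// Illustrative sketch of a knowledge-graph memory model (assumed field
// names, not the server's actual schema).
interface Entity {
  name: string;           // unique identifier, e.g. "project-alpha"
  entityType: string;     // e.g. "project", "person", "decision"
  observations: string[]; // free-text facts attached to the entity
}

interface Relation {
  from: string;           // source entity name
  to: string;             // target entity name
  relationType: string;   // e.g. "depends_on", "owned_by"
}

// In-memory graph: entities keyed by name, plus a flat relation list.
const entities = new Map<string, Entity>();
const relations: Relation[] = [];

// Attach a new observation to an existing entity, if present.
function addObservation(entityName: string, fact: string): void {
  entities.get(entityName)?.observations.push(fact);
}
```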
Why this server?
Provides persistent memory for chat applications using a local knowledge graph, useful for remembering project details across sessions.
Why this server?
This project is based on the Knowledge Graph Memory Server and retains its core functionality as a memory solution.
Why this server?
A Cline MCP integration that lets users save, search, and format memories, storing and retrieving information with vector embeddings so search works by meaning rather than by keyword.
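As a rough illustration of meaning-based retrieval, the sketch below ranks stored memories by cosine similarity against an embedded query; `embed` is a stand-in for whatever embedding model the server actually uses, not part of its documented API.

```typescript
// Hedged sketch of semantic search over stored memories: embed the query,
// rank memories by cosine similarity, return the top matches.
type Memory = { text: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

async function searchMemories(
  query: string,
  memories: Memory[],
  embed: (text: string) => Promise<number[]>, // assumed embedding function
  topK = 5,
): Promise<Memory[]> {
  const q = await embed(query);
  return memories
    .map((m) => ({ m, score: cosine(q, m.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map((x) => x.m);
}
```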
Why this server?
Provides sophisticated context management for Claude, enabling persistent context across sessions, project-specific organization, and conversation continuity.
Why this server?
A high-performance, persistent memory system for the Model Context Protocol (MCP) providing vector search capabilities and efficient knowledge storage using libSQL as the backing store.
Why this server?
A custom Memory MCP Server that acts as a cache for Infrastructure-as-Code information, allowing users to store, summarize, and manage notes with a custom URI scheme and simple resource handling.
Why this server?
A simple vector store that watches a list of directories and automatically indexes all Markdown, HTML, and text files in them, enhancing the context available to the model.
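A minimal sketch of what such a watch-and-index loop could look like in Node; the directory paths and the `indexDocument` helper are hypothetical placeholders for the server's own configuration and vector-store ingestion.

```typescript
import { watch, readFileSync, readdirSync } from "node:fs";
import { extname, join } from "node:path";

// Assumed example paths; a real server would read these from its configuration.
const WATCHED_DIRS = ["./docs", "./notes"];
const EXTENSIONS = new Set([".md", ".html", ".txt"]);

// Placeholder: a real implementation would embed `content` and upsert it
// into the vector store.
function indexDocument(path: string, content: string): void {
  console.log(`indexed ${path} (${content.length} chars)`);
}

// Index every matching file currently in the directory.
function indexDirectory(dir: string): void {
  for (const file of readdirSync(dir)) {
    if (EXTENSIONS.has(extname(file))) {
      const path = join(dir, file);
      indexDocument(path, readFileSync(path, "utf8"));
    }
  }
}

for (const dir of WATCHED_DIRS) {
  indexDirectory(dir); // initial pass over existing files
  watch(dir, (_event, file) => {
    // re-index a file whenever the watcher reports a change to it
    if (file && EXTENSIONS.has(extname(file))) {
      const path = join(dir, file);
      indexDocument(path, readFileSync(path, "utf8"));
    }
  });
}
```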
Why this server?
A Model Context Protocol server that provides AI assistants with structured access to your Logseq knowledge graph, enabling retrieval, searching, analysis, and creation of content within your personal knowledge base.
Why this server?
Facilitates semantic analysis of chat conversations through vector embeddings and knowledge graphs, offering tools for semantic search, concept extraction, and conversation pattern analysis.