Why this server?
Provides persistent memory using a local knowledge graph, allowing Claude to remember information about the user across chats.
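All of the servers in this list are reached the same way: a client spawns (or connects to) the server and invokes its tools over the Model Context Protocol. The sketch below uses the official Python MCP SDK against a knowledge-graph memory server of this kind; the npm package name and the create_entities tool are illustrative assumptions, not details taken from this listing.

```python
# Minimal MCP client sketch. The package name and tool name below are
# assumptions; check the specific server's README for the real launch
# command and tool schema.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Spawn the memory server as a subprocess speaking MCP over stdio.
    params = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-memory"],  # assumed package
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover what the server actually exposes before calling anything.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Hypothetical call: persist a fact about the user in the knowledge graph.
            result = await session.call_tool(
                "create_entities",  # assumed tool name
                arguments={
                    "entities": [
                        {
                            "name": "Alice",
                            "entityType": "person",
                            "observations": ["prefers dark mode"],
                        }
                    ]
                },
            )
            print(result.content)


asyncio.run(main())
```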
Why this server?
Builds a knowledge base from your project's documentation and detects the technologies used in your codebase.
Why this server?
Connects AI models with Obsidian knowledge bases, allowing them to access, manipulate, and manage notes and folder structures.
Why this server?
Obsidian vault connector for Claude Desktop that enables reading and writing Markdown notes via the Model Context Protocol (MCP).
Why this server?
Enables AI models to retrieve information from Ragie's knowledge base through a simple 'retrieve' tool.
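Because this entry names exactly one tool, 'retrieve', the client side reduces to a single call. A sketch assuming a session opened as in the snippet above; the "query" argument key is a guess, so confirm it against the tool's input schema from session.list_tools().

```python
from mcp import ClientSession


async def retrieve_chunks(session: ClientSession, query: str) -> list[str]:
    """Call the 'retrieve' tool named in the description above.

    The "query" argument key is an assumption; the tool's real input shape
    is published in its schema via session.list_tools().
    """
    result = await session.call_tool("retrieve", arguments={"query": query})
    # Keep only the text blocks from the tool result.
    return [block.text for block in result.content if block.type == "text"]
```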
Why this server?
Connects Claude to your documentation via Inkeep's API, enabling AI-powered interactions with your documentation content.
Why this server?
Provides persistent memory integration for chat applications by utilizing a local knowledge graph to remember user information across interactions.
Why this server?
Provides Claude and other LLMs with read-only access to Hugging Face Hub APIs, enabling interaction with models, datasets, spaces, papers, and collections through natural language.
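For orientation, the read-only lookups this server brokers correspond to Hugging Face's public Hub API; a direct equivalent with the huggingface_hub library is shown below. This illustrates the underlying Hub call, not the server's own tool names, which the listing does not give.

```python
# Direct Hub lookup of the kind such a server performs on the model's behalf.
# Requires `pip install huggingface_hub`; no token is needed for public data.
from huggingface_hub import list_models

# Search public models by keyword, most-downloaded first.
for model in list_models(search="sentence-transformers", sort="downloads", limit=5):
    print(model.id, model.downloads)
```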
Why this server?
A Model Context Protocol server that enables Claude and other LLMs to interact with Notion workspaces, providing capabilities like searching, retrieving, creating and updating pages, as well as managing databases.
Why this server?
Provides database interaction and business intelligence capabilities, enabling users to run SQL queries, analyze business data, and automatically generate business insight memos for Microsoft SQL Server databases.
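Here, too, the query path is a single tool call from the client's perspective. The sketch below assumes a session opened as in the first snippet and a read-only query tool; the tool name "read_query", the "query" argument key, and the table schema are all illustrative guesses.

```python
from mcp import ClientSession


async def top_customers(session: ClientSession) -> str:
    """Run a read-only T-SQL query through the server's query tool.

    "read_query", the "query" key, and the orders table are assumptions;
    verify the real tool names and schema via session.list_tools().
    """
    sql = (
        "SELECT TOP 10 customer_id, SUM(total) AS revenue "
        "FROM orders GROUP BY customer_id ORDER BY revenue DESC"
    )
    result = await session.call_tool("read_query", arguments={"query": sql})
    return "\n".join(block.text for block in result.content if block.type == "text")
```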