Why this server?
Provides RAG capabilities for semantic document search using the Qdrant vector database and Ollama/OpenAI embeddings, allowing users to add, search, list, and delete documentation with metadata support.
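A rough sketch of the add-and-search flow such a server wraps, assuming the qdrant-client and openai Python packages and a local Qdrant instance; the collection name, document text, and embedding model are illustrative, and the real server may use Ollama embeddings instead:

```python
from openai import OpenAI
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams

openai_client = OpenAI()                            # needs OPENAI_API_KEY in the environment
qdrant = QdrantClient(url="http://localhost:6333")  # local Qdrant instance

def embed(text: str) -> list[float]:
    # text-embedding-3-small returns 1536-dimensional vectors
    resp = openai_client.embeddings.create(model="text-embedding-3-small", input=text)
    return resp.data[0].embedding

# Create a collection sized for the embedding model
qdrant.recreate_collection(
    collection_name="docs",
    vectors_config=VectorParams(size=1536, distance=Distance.COSINE),
)

# Add a document with metadata, then run a semantic search
qdrant.upsert(
    collection_name="docs",
    points=[PointStruct(id=1, vector=embed("Qdrant is a vector database."),
                        payload={"source": "intro.md"})],
)
for hit in qdrant.search(collection_name="docs", query_vector=embed("what is Qdrant?"), limit=3):
    print(hit.score, hit.payload)
```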
Why this server?
A Model Context Protocol server that provides Claude and other LLMs with read-only access to Hugging Face Hub APIs, enabling interaction with models, datasets, spaces, papers, and collections through natural language.
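For a sense of the read-only calls such a server wraps, here is a minimal sketch using the huggingface_hub package directly; the search terms and model ID are arbitrary examples:

```python
from huggingface_hub import HfApi

api = HfApi()  # anonymous access is enough for public models, datasets, and spaces

# Search models and datasets by keyword
for model in api.list_models(search="llama", limit=5):
    print("model:", model.id)

for dataset in api.list_datasets(search="squad", limit=5):
    print("dataset:", dataset.id)

# Fetch details for a specific model
info = api.model_info("bert-base-uncased")
print(info.pipeline_tag, info.downloads)
```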
Why this server?
Provides a standardized interface for interacting with Prem AI's language models, RAG capabilities, and document management features.
Why this server?
A Model Context Protocol server providing vector database capabilities through Chroma, enabling semantic document search, metadata filtering, and document management with persistent storage.
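A minimal sketch of the same add / filter / query operations against Chroma itself, assuming the chromadb package; the storage path, documents, and metadata are illustrative:

```python
import chromadb

# Persistent storage on disk (path is illustrative)
client = chromadb.PersistentClient(path="./chroma_store")
collection = client.get_or_create_collection("docs")

# Add documents with metadata; Chroma embeds them with its default embedding function
collection.add(
    ids=["doc1", "doc2"],
    documents=["Chroma is an embedding database.", "MCP standardizes tool access for LLMs."],
    metadatas=[{"topic": "databases"}, {"topic": "protocols"}],
)

# Semantic query combined with a metadata filter
results = collection.query(
    query_texts=["what stores embeddings?"],
    n_results=2,
    where={"topic": "databases"},
)
print(results["documents"])
```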
Why this server?
A Model Context Protocol server that enables AI agents to perform Retrieval-Augmented Generation by querying a FAISS vector database containing Sui Move language documents.
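The retrieval step in such a RAG pipeline reduces to a nearest-neighbor lookup in FAISS; a toy sketch with random vectors standing in for real document and query embeddings:

```python
import faiss
import numpy as np

dim = 384  # dimensionality of whatever embedding model is used
index = faiss.IndexFlatL2(dim)

# Index document embeddings (random vectors stand in for embeddings of Sui Move docs)
doc_vectors = np.random.rand(100, dim).astype("float32")
index.add(doc_vectors)

# Retrieve the k nearest documents for a query embedding
query = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query, 4)
print(ids[0])  # indices of the retrieved documents, fed to the LLM as context
```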
Why this server?
A Model Context Protocol server that enables LLMs to fetch and process web content in multiple formats (HTML, JSON, Markdown, text) with automatic format detection.
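A simplified sketch of content-type-driven normalization, assuming the requests and beautifulsoup4 packages; a real server would also handle Markdown and cope with missing or wrong headers:

```python
import json
import requests
from bs4 import BeautifulSoup

def fetch(url: str) -> str:
    """Fetch a URL and normalize the body based on its declared content type."""
    resp = requests.get(url, timeout=10)
    content_type = resp.headers.get("Content-Type", "").split(";")[0].strip()

    if content_type == "application/json":
        return json.dumps(resp.json(), indent=2)      # pretty-print JSON
    if content_type == "text/html":
        return BeautifulSoup(resp.text, "html.parser").get_text(" ", strip=True)
    return resp.text                                  # markdown, plain text, etc.

print(fetch("https://example.com")[:200])
```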
Why this server?
A Model Context Protocol server that automatically reads the Claude Desktop configuration file and presents all available MCP services in an easy-to-copy format at the top of the tools list.
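Reading that configuration file is straightforward; a sketch assuming the standard claude_desktop_config.json locations on macOS and Windows and the documented mcpServers layout:

```python
import json
import platform
from pathlib import Path

def config_path() -> Path:
    """Locate claude_desktop_config.json (macOS and Windows locations shown)."""
    if platform.system() == "Darwin":
        return Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
    return Path.home() / "AppData/Roaming/Claude/claude_desktop_config.json"

config = json.loads(config_path().read_text())

# The "mcpServers" map holds one entry per configured MCP server
for name, spec in config.get("mcpServers", {}).items():
    print(name, "->", spec.get("command"), " ".join(spec.get("args", [])))
```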
Why this server?
A Model Context Protocol server that enables integration between MCP clients and the Graphlit service, which supports ingesting content from a variety of sources and then retrieving relevant content from the MCP client.
Why this server?
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
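As a rough illustration of what "persistent memory as a local knowledge graph" can mean, here is a toy entity/relation/observation store persisted to JSON; the field names and file path are illustrative, not the server's actual schema:

```python
import json
from dataclasses import asdict, dataclass, field
from pathlib import Path

@dataclass
class Entity:
    name: str
    entity_type: str
    observations: list[str] = field(default_factory=list)

@dataclass
class Relation:
    source: str
    target: str
    relation_type: str

@dataclass
class KnowledgeGraph:
    entities: dict[str, Entity] = field(default_factory=dict)
    relations: list[Relation] = field(default_factory=list)

    def remember(self, name: str, entity_type: str, observation: str) -> None:
        # Create the entity on first mention, then append the new observation
        entity = self.entities.setdefault(name, Entity(name, entity_type))
        entity.observations.append(observation)

    def save(self, path: Path) -> None:
        data = {"entities": [asdict(e) for e in self.entities.values()],
                "relations": [asdict(r) for r in self.relations]}
        path.write_text(json.dumps(data, indent=2))

graph = KnowledgeGraph()
graph.remember("Alice", "person", "prefers answers with code examples")
graph.relations.append(Relation("Alice", "Python", "uses"))
graph.save(Path("memory.json"))
```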
Why this server?
An MCP server that provides persistent memory capabilities for Claude, offering a tiered memory architecture with semantic search, memory consolidation, and integration with the Claude desktop application.
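A toy sketch of the tiered idea, assuming memory embeddings are already computed elsewhere: new items sit in a short-term tier, the oldest are "consolidated" into a long-term tier, and retrieval is a cosine-similarity search over both:

```python
import numpy as np

class MemoryStore:
    """Toy tiered memory: a bounded short-term tier plus a long-term tier."""

    def __init__(self, short_term_limit: int = 3):
        self.short_term: list[tuple[str, np.ndarray]] = []
        self.long_term: list[tuple[str, np.ndarray]] = []
        self.short_term_limit = short_term_limit

    def add(self, text: str, embedding: np.ndarray) -> None:
        self.short_term.append((text, embedding))
        if len(self.short_term) > self.short_term_limit:
            # "Consolidation" here simply demotes the oldest item to the long-term tier
            self.long_term.append(self.short_term.pop(0))

    def search(self, query: np.ndarray, k: int = 2) -> list[str]:
        # Cosine similarity across both tiers, highest scores first
        items = self.short_term + self.long_term
        scores = [float(np.dot(query, emb) / (np.linalg.norm(query) * np.linalg.norm(emb)))
                  for _, emb in items]
        ranked = sorted(zip(scores, items), key=lambda pair: pair[0], reverse=True)
        return [text for _, (text, _) in ranked[:k]]
```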