Why this server?
This MCP server provides tools for retrieving and processing documentation through vector search, enabling AI assistants to augment their responses with relevant documentation context.
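For orientation, here is a minimal sketch (not this server's actual code) of how a documentation-retrieval tool backed by vector search can be exposed over MCP using the official Python SDK's FastMCP helper. The tool name `search_docs`, the in-memory index, and the `embed()` stand-in are illustrative assumptions; a real server would call an embedding model and a vector database.

```python
# Hypothetical sketch of a documentation-retrieval MCP tool (not this project's code).
import math
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docs-rag")  # placeholder server name

def embed(text: str) -> list[float]:
    # Stand-in embedding: normalized character-frequency vector, purely illustrative.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

# Toy in-memory "vector store": (document chunk, embedding) pairs.
DOCS = [
    "FastMCP lets you register tools with a decorator.",
    "Vector search ranks documents by embedding similarity.",
]
INDEX = [(doc, embed(doc)) for doc in DOCS]

@mcp.tool()
def search_docs(query: str, top_k: int = 3) -> list[str]:
    """Return the documentation chunks most similar to the query."""
    q = embed(query)
    scored = sorted(INDEX, key=lambda item: -sum(a * b for a, b in zip(q, item[1])))
    return [doc for doc, _ in scored[:top_k]]

if __name__ == "__main__":
    mcp.run()  # serve the tool over stdio for an MCP client to call
```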
Why this server?
An MCP server implementation for the RAG Web Browser Actor. The Actor serves as a web browser for large language models (LLMs) and RAG pipelines, similar to the web search feature in ChatGPT.
Why this server?
A tool that makes it easy to register Anthropic's Model Context Protocol (MCP) servers with Claude Desktop and Cursor, providing RAG functionality, Dify integration, and web search capabilities.
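As context for what such registration involves: Claude Desktop discovers MCP servers through the `mcpServers` section of its `claude_desktop_config.json`. The sketch below shows the effect of adding an entry; the server name and launch command are placeholders, and the config path shown is the macOS default (it differs on Windows and Linux).

```python
# Sketch of registering a placeholder MCP server with Claude Desktop.
import json
from pathlib import Path

# macOS default location; other platforms use a different path.
config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["docs-rag"] = {
    "command": "uvx",                    # placeholder launcher
    "args": ["my-docs-rag-server"],      # placeholder package / entry point
}
config_path.parent.mkdir(parents=True, exist_ok=True)
config_path.write_text(json.dumps(config, indent=2))
```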
Why this server?
Implementation of Retrieval-Augmented Generation using GroundX and OpenAI, enabling semantic search and document retrieval with Modern Context Processing for enhanced context handling.
Why this server?
An MCP server that fetches up-to-date documentation for popular libraries such as LangChain, LlamaIndex, MCP, and OpenAI, allowing LLMs to access library information more recent than their knowledge cut-off dates.
Why this server?
Enables users to manage and navigate nf-core bioinformatics pipeline repositories, supporting listing, searching, and exploring pipeline configurations, workflows, and modules (see the client sketch below).
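As a rough illustration of how an assistant-side client might drive a server like this one, the sketch below uses the MCP Python SDK's stdio client. The launch command and the `search_modules` tool name and arguments are hypothetical; the real tool names are defined by the server itself.

```python
# Hypothetical client-side sketch: connect to a pipeline-navigation MCP server over stdio.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="uvx", args=["nf-core-mcp-server"])  # placeholder launcher

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()            # discover what the server exposes
            print([t.name for t in tools.tools])
            result = await session.call_tool(             # hypothetical tool + arguments
                "search_modules", arguments={"query": "fastqc"}
            )
            print(result.content)

asyncio.run(main())
```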