Why this server?
Enables AI models to access GitHub repository contents as context, a crucial aspect of RAG.
Why this server?
Provides structured access to markdown documentation, enabling informed code generation through RAG.
Why this server?
Enables AI assistants to interact with file systems, databases, and GitHub repositories, supporting RAG workflows.
Why this server?
Makes documentation or codebases searchable by AI assistants, letting users chat with code or docs by simply pointing to a git repository or folder, making it a natural fit for RAG.
Why this server?
Open-source MCP implementation that provides document management, aiming to replicate Cursor's @Docs feature and thereby enabling RAG.
Why this server?
An MCP server that makes it easy to load your codebase into Large Language Models (LLMs), supplying the context RAG applications depend on.
Why this server?
A Model Context Protocol server that helps large language models index, search, and analyze code repositories with minimal setup, aiding RAG for code-related questions.
Why this server?
Enhances LLM applications with deep, autonomous web research capabilities, delivering higher-quality information for RAG applications.
Why this server?
Provides integration between Merge API and LLM providers supporting the MCP protocol, allowing natural language interaction with Merge data across HRIS, ATS, and other categories, enabling RAG over HR data.
Why this server?
An open-source MCP implementation that connects AI to data sources, well suited to building RAG applications.