A TypeScript MCP server for querying documents with LLMs, using locally stored repositories and text files as context through a RAG (Retrieval-Augmented Generation) system.
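As a rough sketch of what such a server looks like, the snippet below registers a single document-query tool with the official `@modelcontextprotocol/sdk` package; the tool name `query_documents` and the `queryLocalDocs` helper are hypothetical stand-ins, not this server's actual tool names or retrieval logic.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical retrieval helper standing in for the server's actual RAG lookup
// over locally stored repositories and text files.
async function queryLocalDocs(query: string, limit: number): Promise<string[]> {
  // Placeholder: a real implementation would embed the query and rank stored chunks.
  return [`(no index loaded) query was: ${query}, limit ${limit}`];
}

const server = new McpServer({ name: "doc-rag", version: "0.1.0" });

// Expose one tool that returns retrieved passages for the LLM to use as context.
server.tool(
  "query_documents",
  { query: z.string(), limit: z.number().optional() },
  async ({ query, limit }) => {
    const passages = await queryLocalDocs(query, limit ?? 5);
    return { content: [{ type: "text", text: passages.join("\n---\n") }] };
  }
);

// Serve over stdio so Claude Desktop (or any MCP client) can launch the process.
await server.connect(new StdioServerTransport());
```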
A server that integrates with Claude Desktop to enable real-time web research capabilities, allowing users to search Google, extract webpage content, and capture screenshots directly from conversations.
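The entry does not say how content extraction and screenshots are implemented; the sketch below shows one plausible approach behind such tools, assuming a headless browser (Puppeteer here) rather than this server's actual stack.

```typescript
import puppeteer from "puppeteer";

// Visit a page, pull its rendered text, and capture a screenshot:
// roughly the operations a web-research tool like this would wrap.
async function researchPage(url: string) {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle2" });

  // Extract the visible text for the assistant to read and summarize.
  const text = await page.evaluate(() => document.body.innerText);

  // Capture a base64 screenshot that can be sent back to the client.
  const screenshot = await page.screenshot({ encoding: "base64" });

  await browser.close();
  return { text, screenshot };
}
```

A screenshot captured this way can be returned to the client as an MCP image content block (`{ type: "image", data, mimeType: "image/png" }`).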
Provides RAG capabilities for semantic document search using the Qdrant vector database and Ollama or OpenAI embeddings, allowing users to add, search, list, and delete documentation with metadata support.
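A minimal sketch of the add/search flow described here, assuming the `@qdrant/js-client-rest` and `openai` packages; the collection name, embedding model, and function names are illustrative, not the server's actual tool names.

```typescript
import { QdrantClient } from "@qdrant/js-client-rest";
import OpenAI from "openai";

const qdrant = new QdrantClient({ url: "http://localhost:6333" });
const openai = new OpenAI();

// Embed a chunk of text with OpenAI (an Ollama endpoint could be used instead).
async function embed(text: string): Promise<number[]> {
  const res = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: text,
  });
  return res.data[0].embedding;
}

// Store a documentation chunk together with its metadata.
async function addDocumentation(id: number, text: string, source: string) {
  await qdrant.upsert("documentation", {
    points: [{ id, vector: await embed(text), payload: { text, source } }],
  });
}

// Return the stored chunks whose embeddings are closest to the query.
async function searchDocumentation(query: string, limit = 5) {
  return qdrant.search("documentation", { vector: await embed(query), limit });
}
```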
An MCP server implementation that provides tools for retrieving and processing documentation through vector search, enabling AI assistants to augment their responses with relevant documentation context.
Uses Ollama or OpenAI to generate embeddings; the Ollama path is sketched below.
Docker files are included.
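The OpenAI embedding call looks like the earlier Qdrant sketch; the Ollama side might look like the snippet below, assuming Ollama's default local endpoint and a common embedding model (neither is taken from this server's code).

```typescript
// Generate an embedding from a locally running Ollama instance.
// The endpoint (default port 11434) and model name are assumptions for illustration.
async function ollamaEmbed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  if (!res.ok) throw new Error(`Ollama embedding request failed: ${res.status}`);
  const json = (await res.json()) as { embedding: number[] };
  return json.embedding;
}
```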
Enables AI assistants to enhance their responses with relevant documentation through semantic vector search, offering tools for managing and processing documentation efficiently.