A generic Model Context Protocol (MCP) framework for building AI-powered applications. It provides standardized ways to create MCP servers and clients that integrate LLMs, with support for Ollama and Supabase.
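For context, the sketch below shows roughly what building such a server looks like directly against the official MCP TypeScript SDK; the server name, tool, and placeholder response are illustrative assumptions, not this framework's actual API.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Illustrative server name/version; a framework like the one described
// would typically generate this boilerplate for you.
const server = new McpServer({ name: "example-docs-server", version: "0.1.0" });

// Register a single tool that a connected LLM can call.
server.tool(
  "echo_query",          // hypothetical tool name
  { query: z.string() }, // input schema declared with zod
  async ({ query }) => ({
    content: [{ type: "text", text: `You asked: ${query}` }],
  })
);

// Expose the server over stdio, the transport most MCP clients expect.
await server.connect(new StdioServerTransport());
```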
Fetches and extracts comprehensive package documentation from multiple programming language ecosystems (JavaScript, Python, Java, etc.) for LLMs like Claude without requiring API keys.
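As one concrete case, JavaScript package documentation can be pulled from the public npm registry without any credentials; the sketch below is an assumption about how such a lookup might work, relying on Node 18+ for the built-in fetch and on the README field the registry bundles with a package's metadata.

```typescript
// Fetch a package's README text from the public npm registry (no API key needed).
async function fetchNpmReadme(pkg: string): Promise<string> {
  const res = await fetch(`https://registry.npmjs.org/${encodeURIComponent(pkg)}`);
  if (!res.ok) throw new Error(`registry returned ${res.status} for ${pkg}`);
  const meta = await res.json();
  // The full metadata document usually carries the README of the latest version.
  return meta.readme ?? "(no README published)";
}

// Example usage:
// const text = await fetchNpmReadme("express");
// console.log(text.slice(0, 500));
```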
Provides RAG capabilities for semantic document search using the Qdrant vector database and Ollama/OpenAI embeddings, allowing users to add, search, list, and delete documentation with metadata support.
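A rough sketch of how the add and search operations could map onto Qdrant's JavaScript client follows; the collection name, payload fields, and the injected embed callback are assumptions, not this server's actual code.

```typescript
import { QdrantClient } from "@qdrant/js-client-rest";
import { randomUUID } from "node:crypto";

// A function that turns text into a vector (Ollama or OpenAI; see the
// embedding sketch further below). Injected so the storage layer stays
// provider-agnostic.
type Embed = (text: string) => Promise<number[]>;

const qdrant = new QdrantClient({ url: "http://localhost:6333" }); // assumed local Qdrant
const COLLECTION = "documentation"; // assumed collection name

// Add a documentation chunk, storing its text and metadata as the point payload.
async function addDoc(embed: Embed, text: string, metadata: { source: string; title?: string }) {
  const vector = await embed(text);
  await qdrant.upsert(COLLECTION, {
    wait: true,
    points: [{ id: randomUUID(), vector, payload: { text, ...metadata } }],
  });
}

// Semantic search: embed the query and return the payloads of the closest points.
async function searchDocs(embed: Embed, query: string, limit = 5) {
  const vector = await embed(query);
  const hits = await qdrant.search(COLLECTION, { vector, limit, with_payload: true });
  return hits.map((h) => ({ score: h.score, ...(h.payload ?? {}) }));
}
```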
An MCP server implementation that provides tools for retrieving and processing documentation through vector search, enabling AI assistants to augment their responses with relevant documentation context.
Uses Ollama or OpenAI to generate embeddings.
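One plausible way to switch between the two providers is sketched below; the model names and the environment-variable check are assumptions, and whichever model is chosen must produce vectors whose dimensionality matches the Qdrant collection's configured size.

```typescript
// Generate an embedding with OpenAI when an API key is configured,
// otherwise fall back to a local Ollama instance. Model names and the
// env-var switch are illustrative assumptions.
async function embed(text: string): Promise<number[]> {
  const openaiKey = process.env.OPENAI_API_KEY;

  if (openaiKey) {
    const res = await fetch("https://api.openai.com/v1/embeddings", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${openaiKey}`,
      },
      body: JSON.stringify({ model: "text-embedding-3-small", input: text }),
    });
    const data = await res.json();
    return data.data[0].embedding;
  }

  // Local Ollama embeddings endpoint; requires e.g. `ollama pull nomic-embed-text`.
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const data = await res.json();
  return data.embedding;
}
```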
Docker files are included.
Enables AI assistants to enhance their responses with relevant documentation through semantic vector search, offering tools for managing and processing documentation efficiently.
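From the assistant side, consuming such a server looks roughly like the client sketch below; the launch command, tool name, and arguments are placeholders rather than this project's documented interface.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the documentation server as a child process over stdio.
// The command and arguments are placeholders for however the server is started.
const transport = new StdioClientTransport({
  command: "node",
  args: ["build/index.js"],
});

const client = new Client(
  { name: "example-client", version: "0.1.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Discover the documentation tools the server exposes...
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// ...and invoke a hypothetical search tool with a natural-language query.
const result = await client.callTool({
  name: "search_documentation",
  arguments: { query: "how do I configure the vector store?", limit: 3 },
});
console.log(result);
```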