Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@Libragen search our company docs for API rate limiting policies".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Create private, local RAG libraries from any documentation. Libraries are single SQLite files you can share with your team—no cloud, no API keys.
Why libragen?
Stop hallucinations — Give AI agents authoritative docs to cite instead of guessing
Always current — Rebuild when docs change; your AI gets the latest APIs
Private & local — Everything runs on your machine, nothing leaves your network
Shareable — Single `.libragen` files work anywhere
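The single-file idea can be illustrated with plain SQLite: one database file holds both the documents and a full-text index, so sharing the library means copying one file. The schema below is purely illustrative, not libragen's actual `.libragen` format:

```python
import sqlite3

# A single-file "library": SQLite with an FTS5 full-text index.
# Illustrative schema only -- libragen's real .libragen layout is not shown here.
conn = sqlite3.connect(":memory:")  # a real library would use a file path instead
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")
conn.executemany(
    "INSERT INTO docs (title, body) VALUES (?, ?)",
    [
        ("Rate limits", "API requests are limited to 100 per minute."),
        ("Streaming", "Responses can be streamed chunk by chunk."),
    ],
)
conn.commit()

# Full-text query, best match first.
rows = conn.execute(
    "SELECT title FROM docs WHERE docs MATCH ? ORDER BY rank",
    ("limits OR limited",),
).fetchall()
print(rows[0][0])  # prints "Rate limits"
```

Because everything lives in one database file, "sharing with your team" is just checking the file into a repo or dropping it on a shared drive.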
Packages
| Package | Description |
| ------- | ----------- |
|         | Build and query libraries from the command line |
|         | Connect AI assistants to your libraries via MCP |
|         | Programmatic API for embedding, search, and library management |
Quick Start
1. Build a library
2. Connect your AI
Restart your AI tool (Claude Desktop, VS Code, Cursor, etc.). Libraries in your global directory are now searchable.
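Connecting an AI tool usually means registering the MCP server in its configuration file. A minimal sketch for Claude Desktop's `claude_desktop_config.json`, assuming the server is launched via `npx` — the package name `libragen-mcp` here is a guess for illustration, not necessarily the real name:

```json
{
  "mcpServers": {
    "libragen": {
      "command": "npx",
      "args": ["-y", "libragen-mcp"]
    }
  }
}
```

After editing the config, restart the tool so it picks up the new server entry.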
3. Ask questions
"How do I implement tool use with Claude's API?"
"What's our internal policy on deploying to production?"
"Show me examples of streaming responses from the Anthropic cookbook"
Your AI retrieves relevant documentation and responds with accurate, cited answers—not hallucinated guesses from 2-year-old training data.
What else can you do?
Chat with your Obsidian vault — Tutorial →
Make your company's internal docs searchable — Runbooks, wikis, policies—all queryable by AI
Create a shared library for your team — One `.libragen` file, everyone's on the same page

Auto-build libraries in CI — Use the GitHub Action to generate `.libragen` files on every push
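The CI idea can be sketched as a GitHub Actions workflow that rebuilds the library on every push; the build step below is a placeholder, since the real action name and inputs come from the libragen repository:

```yaml
# Illustrative workflow -- substitute the real libragen GitHub Action.
name: Build docs library
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder step: the actual libragen action generates the .libragen file here.
      - name: Build .libragen file
        run: echo "replace with the libragen build action"
```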
License
MIT — see LICENSE for details.