
Create private, local RAG libraries that ground your AI in real documentation—not 2-year-old training data. No cloud, no API keys, just single files you can share with your whole team.

What's RAG? Retrieval-Augmented Generation lets AI retrieve relevant context before responding, instead of relying solely on training data. libragen packages your docs into searchable libraries your AI can query.

🎯 Why libragen?

  • Ground AI in truth — Give your coding agents authoritative docs to cite, dramatically reducing hallucinations

  • Always current — Rebuild libraries when docs change; your AI gets the latest APIs, not stale training data

  • Private & local — Everything runs on your machine. No API keys, no cloud bills, no data leaving your network

  • Shareable — Single .libragen files work anywhere. Share via git, S3, or install from curated collections

✨ Features

  • Hybrid Search — Combines vector similarity with BM25 keyword matching (see the sketch after this list)

  • 📊 Reranking — Optional cross-encoder reranking for improved relevance

  • 📦 Portable — Single-file SQLite databases with embedded vectors

  • 🧠 Smart Chunking — Language-aware splitting that respects code boundaries

  • 🌐 Multiple Sources — Build from local files or git repositories

  • 🤖 MCP Native — Works directly in Claude Desktop, VS Code, and any MCP client
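
To make the hybrid search idea concrete, here is a small sketch of reciprocal rank fusion (RRF), the fusion step named in the architecture section for merging vector-similarity and BM25 result lists. It is illustrative only, not libragen's internal API, and the k constant of 60 is just the common default from the RRF literature:

// Merge ranked result lists (e.g. vector search and BM25) with reciprocal rank fusion.
// Each input is an array of chunk IDs ordered from best to worst.
function reciprocalRankFusion(rankings: string[][], k = 60): Map<string, number> {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((id, index) => {
      // Chunks near the top of any list contribute a larger 1 / (k + rank) term.
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + index + 1));
    });
  }
  return scores;
}

// Example: chunks that rank well in both lists float to the top of the fused order.
const fused = reciprocalRankFusion([
  ['auth.md#2', 'auth.md#5', 'setup.md#1'],   // BM25 keyword order
  ['setup.md#1', 'tokens.md#4', 'auth.md#2'], // vector-similarity order
]);
console.log([...fused.entries()].sort((a, b) => b[1] - a[1]));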

📦 Packages

Package          Description
@libragen/core   Core library for embedding, chunking, storage
@libragen/cli    Command-line interface for building and querying
@libragen/mcp    Model Context Protocol server for AI assistants

🚀 Quick Start

Installation

npm install -g @libragen/cli

Build a Library

# From your internal docs
libragen build ./internal-api-docs --name internal-api

# From a private git repository
libragen build https://github.com/your-org/private-docs -o company-docs.libragen

# From any public repo
libragen build https://github.com/facebook/react -o react.libragen

Query a Library

libragen query "how to authenticate users" -l my-project.libragen

Use with AI Assistants

Install the MCP server globally:

npm install -g @libragen/mcp

Add to your Claude Desktop config (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json):

{ "mcpServers": { "libragen": { "command": "npx", "args": ["-y", "@libragen/mcp"] } } }

Then install libraries to make them available:

libragen install my-project.libragen

CLI Commands

Command                Description
build <source>         Build a library from files or git repo
query <query>          Search a library for relevant content
info <library>         Display library metadata
list                   List installed libraries and collections
install <source>       Install a library or collection
uninstall <name>       Remove an installed library or collection
update [name]          Update installed libraries to newer versions
collection create      Create a collection file
config                 Display configuration and paths
completions <action>   Manage shell completions (bash, zsh, fish)

📚 Collections

Collections are JSON files that group libraries together for easy installation:

{ "name": "my-stack", "description": "Libraries for my project", "version": "1.0.0", "items": [ { "library": "https://example.com/react.libragen" }, { "library": "https://example.com/typescript.libragen" }, { "library": "https://example.com/testing.libragen", "required": false }, { "collection": "https://example.com/base-web.json" } ] }

Create a collection:

# Initialize a template
libragen collection init my-stack.json

# Or create with libraries directly
libragen collection create my-stack.json \
  -l ./react.libragen \
  -l ./typescript.libragen \
  -o ./testing.libragen

Install a collection:

libragen install ./my-stack.json          # Required libraries only
libragen install ./my-stack.json --all    # Include optional libraries

Collections support:

  • Nesting — Collections can include other collections

  • Deduplication — Libraries are only installed once (see the sketch after this list)

  • Optional items — Mark libraries as "required": false

  • Reference counting — Uninstalling removes only unreferenced libraries
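
As a rough mental model of how nesting and deduplication fit together, the sketch below walks a collection tree and returns each library URL once. The types and the fetchCollection parameter are hypothetical stand-ins, not the actual CollectionResolver API:

// Hypothetical shapes mirroring the collection JSON shown above.
interface CollectionItem { library?: string; collection?: string; required?: boolean }
interface Collection { name: string; items: CollectionItem[] }

// Follow nested collections recursively, returning each library URL exactly once.
async function resolveLibraries(
  url: string,
  fetchCollection: (url: string) => Promise<Collection>,
  seen = new Set<string>(),
): Promise<string[]> {
  const libraries: string[] = [];
  const collection = await fetchCollection(url);
  for (const item of collection.items) {
    if (item.library && !seen.has(item.library)) {
      seen.add(item.library);            // deduplication: each library is installed once
      libraries.push(item.library);
    } else if (item.collection && !seen.has(item.collection)) {
      seen.add(item.collection);         // also guards against collection cycles
      libraries.push(...await resolveLibraries(item.collection, fetchCollection, seen));
    }
  }
  return libraries;
}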

βš™οΈ Configuration

Storage Location

By default, libragen stores libraries and configuration in a platform-specific directory:

Platform   Default Location
macOS      ~/Library/Application Support/libragen
Windows    %APPDATA%\libragen
Linux      $XDG_DATA_HOME/libragen (defaults to ~/.local/share/libragen)

Override this by setting the LIBRAGEN_HOME environment variable:

export LIBRAGEN_HOME=/custom/path/to/libragen
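
The lookup order implied by the table above (explicit override first, then the platform default) can be reproduced in a few lines. This is a sketch for orientation, not libragen's actual code:

import os from 'node:os';
import path from 'node:path';

// Pick the libragen data directory, preferring the LIBRAGEN_HOME override.
function resolveLibragenHome(): string {
  if (process.env.LIBRAGEN_HOME) return process.env.LIBRAGEN_HOME;
  switch (process.platform) {
    case 'darwin': // macOS
      return path.join(os.homedir(), 'Library', 'Application Support', 'libragen');
    case 'win32': // Windows
      return path.join(process.env.APPDATA ?? path.join(os.homedir(), 'AppData', 'Roaming'), 'libragen');
    default: // Linux and other XDG platforms
      return path.join(process.env.XDG_DATA_HOME ?? path.join(os.homedir(), '.local', 'share'), 'libragen');
  }
}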

The directory structure is:

$LIBRAGEN_HOME/
  libraries/         # Installed .libragen files
  manifest.json      # Tracks installed libraries and collections
  collections.json   # Collection configuration
  cache/             # Cached collection indexes

📄 Library Format

A .libragen file is a SQLite database containing:

  • Metadata — Library name, version, description, embedding model info

  • Chunks — Code/documentation segments with source file info

  • Embeddings — Vector representations using Xenova/bge-small-en-v1.5 (384 dimensions)

  • FTS Index — Full-text search index for keyword matching
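
Because a library is a plain SQLite file, you can inspect one with the standard sqlite3 shell. The commands below only assume sqlite3 is installed; the table names you will see are an internal detail of libragen and may change between versions:

# List the tables inside a library
sqlite3 my-project.libragen '.tables'

# Dump the schema, including the full-text and vector index tables
sqlite3 my-project.libragen '.schema'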

📖 Programmatic Usage

Use @libragen/core directly in your TypeScript/JavaScript projects:

import { Library, Searcher, Embedder, Reranker } from '@libragen/core';

// Open an existing library and search it
const library = await Library.open('./my-docs.libragen');

const embedder = new Embedder();
await embedder.initialize();

const reranker = new Reranker();
await reranker.initialize();

const searcher = new Searcher(embedder, library.getStore(), { reranker });

const results = await searcher.search({
  query: 'how do I authenticate?',
  k: 5,
  rerank: true, // Use cross-encoder reranking
});

for (const result of results) {
  console.log(`[${result.score.toFixed(3)}] ${result.sourceFile}`);
  console.log(result.content);
}

await library.close();

import { Builder } from '@libragen/core';

// Build a library from source files
const builder = new Builder();
const result = await builder.build('./docs', {
  name: 'my-docs',
  description: 'Internal API documentation',
  include: ['**/*.md', '**/*.mdx'],
});

console.log(`Built ${result.outputPath} with ${result.stats.chunkCount} chunks`);

πŸ› οΈ Development

# Install dependencies
npm install

# Run tests
npm test

# Run linting
npm run standards

# Build all packages
npm run build

πŸ—οΈ Architecture

@libragen/cli  (build, query, install, manage)
      │
      ▼
@libragen/core
  ├── Embedder            (bge-small-en-v1.5)
  ├── Chunker             (language-aware splitting)
  ├── VectorStore         (SQLite + sqlite-vec + FTS5)
  ├── Searcher            (hybrid search with RRF)
  ├── Reranker            (mxbai-rerank-xsmall-v1)
  ├── Library             (create/open/validate)
  ├── LibraryManager      (install/uninstall/update)
  ├── Manifest            (tracks installations)
  ├── CollectionResolver  (nested collections)
  └── Sources             (FileSource, GitSource)
      │
      ▼
@libragen/mcp  (MCP server for AI assistants)
  Tools: libragen_search, libragen_list, libragen_build, libragen_install,
         libragen_uninstall, libragen_update, libragen_collection
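
Any MCP-capable client can drive the server directly over stdio. The snippet below uses the official @modelcontextprotocol/sdk TypeScript client; the argument shape passed to libragen_search is an assumption for illustration, so list the tools first and follow the input schema the server actually reports:

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Launch the libragen MCP server as a child process and connect over stdio.
const transport = new StdioClientTransport({ command: 'npx', args: ['-y', '@libragen/mcp'] });
const client = new Client({ name: 'libragen-example', version: '0.0.1' }, { capabilities: {} });
await client.connect(transport);

// Discover the tools (and their input schemas) the server exposes.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

// Hypothetical call; confirm the real input schema from the listTools() output above.
const result = await client.callTool({
  name: 'libragen_search',
  arguments: { query: 'how to authenticate users' },
});
console.log(result);

await client.close();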

πŸ™ Acknowledgments

libragen uses the following open-source models:

  • BGE-small-en-v1.5 — Embedding model by BAAI (MIT License)

  • mxbai-rerank-xsmall-v1 — Reranking model by Mixedbread (Apache-2.0)

If you use libragen in academic work, please cite the underlying models:

@misc{bge_embedding,
  title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
  author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
  year={2023},
  eprint={2309.07597},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}

@online{rerank2024mxbai,
  title={Boost Your Search With The Crispy Mixedbread Rerank Models},
  author={Aamir Shakir and Darius Koenig and Julius Lipp and Sean Lee},
  year={2024},
  url={https://www.mixedbread.ai/blog/mxbai-rerank-v1},
}

📜 License

MIT — see LICENSE for details.
