
Code2Prompt is a powerful context engineering tool designed to ingest codebases and format them for Large Language Models. Whether you are manually copying context for ChatGPT, building AI agents via Python, or running an MCP server, Code2Prompt streamlines the context preparation process.
⚡ Quick Install
Cargo
To enable optional Wayland support (e.g., for clipboard integration on Wayland-based systems), use the wayland feature flag:
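For example (assuming the crates.io package name `code2prompt` and a `wayland` feature flag as described above):

```shell
# Install the latest release from crates.io
cargo install code2prompt

# Or, with optional Wayland clipboard support
cargo install code2prompt --features wayland
```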
Homebrew
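A minimal sketch, assuming a Homebrew formula named `code2prompt` is available:

```shell
brew install code2prompt
```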
SDK with pip 🐍
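The package name below is an assumption (the Rust-backed SDK is published on PyPI; check PyPI for the exact name):

```shell
pip install code2prompt-rs
```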
Related MCP server: promptz.dev MCP Server
🚀 Quick Start
Once installed, generating a prompt from your codebase is as simple as pointing the tool to your directory.
Basic Usage: Generate a prompt from the current directory and copy it to the clipboard.
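A minimal invocation, assuming the default behavior copies the generated prompt to your clipboard:

```shell
# Generate a prompt from the current directory
# (result is copied to the clipboard by default)
code2prompt .
```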
Save to file:
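A sketch using an output-file flag (the exact flag name may differ between versions; check `code2prompt --help`):

```shell
# Write the generated prompt to a file instead of the clipboard
code2prompt . --output-file prompt.txt
```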
🌐 Ecosystem
Code2Prompt is more than just a CLI tool. It is a complete ecosystem for codebase context.
| 🧱 Core Library | 💻 CLI Tool | 🐍 Python SDK | 🤖 MCP Server |
| --- | --- | --- | --- |
| The internal, high-speed library responsible for secure file traversal, respecting .gitignore rules. | Designed for humans, featuring both a minimal CLI and an interactive TUI. Generate formatted prompts, track token usage, and output results to your clipboard or stdout. | Fast Python bindings to the Rust core. Ideal for AI agents, automation scripts, or deep integration into RAG pipelines. Available on PyPI. | Run Code2Prompt as a local service so agentic applications can read your local codebase efficiently without bloating your context window. |
📚 Documentation
Check our online documentation for detailed instructions.
✨ Features
Code2Prompt transforms your entire codebase into a well-structured prompt for large language models. Key features include:
Terminal User Interface (TUI): Interactive terminal interface for configuring and generating prompts
Smart Filtering: Include/exclude files using glob patterns and respect .gitignore rules
Flexible Templating: Customize prompts with Handlebars templates for different use cases
Automatic Code Processing: Convert codebases of any size into readable, formatted prompts
Token Tracking: Track token usage to stay within LLM context limits
Smart File Reading: Simplify reading various file formats for LLMs (CSV, Notebooks, JSONL, etc.)
Git Integration: Include diffs, logs, and branch comparisons in your prompts
Blazing Fast: Built in Rust for high performance and low resource usage
Stop manually copying files and formatting code for LLMs. Code2Prompt handles the tedious work so you can focus on getting insights and solutions from AI models.
Alternative Installation
Refer to the documentation for detailed installation instructions.
Binary releases
Download the latest binary for your OS from Releases.
Source build
Requires: Git, Rust, and Cargo.
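A build from source can be sketched as follows (repository URL assumed; adjust if you are building from a fork):

```shell
# Clone the repository and build an optimized binary
git clone https://github.com/mufeedvh/code2prompt.git
cd code2prompt
cargo build --release
```

The resulting binary lands in `target/release/`.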
⭐ Star Gazing
📜 License
Licensed under the MIT License; see LICENSE for more information.
Liked the project?
If you liked the project and found it useful, please give it a :star: !
👥 Contribution
Ways to contribute:
Suggest a feature
Report a bug
Fix something and open a pull request
Help me document the code
Spread the word