code2prompt
Quick Install ⚡
CLI with cargo 🦀
SDK with pip 🐍
How is it useful?
Core
code2prompt is a code ingestion tool that streamlines the process of creating LLM prompts for code analysis, generation, and other tasks. It works by traversing directories, building a tree structure, and gathering information about each file. The core library can easily be integrated into other applications.
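The core idea — walk a directory, build a tree, and collect per-file information — can be sketched with the Python standard library. This is only an illustrative sketch of the approach, not code2prompt's actual Rust implementation:

```python
from pathlib import Path

def build_tree(root: Path) -> dict:
    """Recursively walk `root`, building a nested dict that mirrors the
    directory structure and records basic info about each file."""
    node = {"name": root.name, "type": "dir", "children": []}
    for entry in sorted(root.iterdir(), key=lambda p: p.name):
        if entry.is_dir():
            node["children"].append(build_tree(entry))
        else:
            node["children"].append({
                "name": entry.name,
                "type": "file",
                "size": entry.stat().st_size,  # bytes on disk
            })
    return node
```

A tree like this can then be rendered as an ASCII outline and each file's contents appended below it to form the final prompt.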
CLI
The code2prompt command line interface (CLI) was designed for humans to generate prompts directly from a codebase. The generated prompt is automatically copied to your clipboard and can also be saved to an output file. Furthermore, you can customize prompt generation using Handlebars templates. Check out the provided prompts in the documentation!
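As an illustration, a minimal Handlebars template might look like the following. The variable names (`absolute_code_path`, `source_tree`, `files` with `path` and `code`) are assumed here to match the defaults shipped with code2prompt; check the documentation for the exact set available in your version:

```handlebars
Project Path: {{absolute_code_path}}

Source Tree:
{{source_tree}}

{{#each files}}
`{{path}}`:
{{code}}
{{/each}}
```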
SDK
The code2prompt software development kit (SDK) offers Python bindings to the core library. This is perfect for AI agents or automation scripts that need to interact with codebases seamlessly. The SDK is hosted on PyPI and can be installed via pip.
MCP
code2prompt is also available as a Model Context Protocol (MCP) server, which allows you to run it as a local service. This puts LLMs on steroids by providing them a tool that automatically gathers a well-structured context of your codebase.
Documentation 📚
Check out our online documentation for detailed instructions.
Features
Code2Prompt transforms your entire codebase into a well-structured prompt for large language models. Key features include:
- Automatic Code Processing: Convert codebases of any size into readable, formatted prompts
- Smart Filtering: Include/exclude files using glob patterns and respect `.gitignore` rules
- Flexible Templating: Customize prompts with Handlebars templates for different use cases
- Token Tracking: Track token usage to stay within LLM context limits
- Git Integration: Include diffs, logs, and branch comparisons in your prompts
- Developer Experience: Automatic clipboard copy, line numbers, and file organization options
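The include/exclude glob semantics can be illustrated with a short Python sketch (again, an illustration of the idea, not code2prompt's actual implementation):

```python
from fnmatch import fnmatch

def is_selected(path: str, include: list[str], exclude: list[str]) -> bool:
    """Return True if `path` survives filtering: exclude patterns win,
    then the path must match an include pattern (if any are given)."""
    if any(fnmatch(path, pat) for pat in exclude):
        return False  # explicitly excluded
    if include:
        return any(fnmatch(path, pat) for pat in include)
    return True  # no include patterns means everything is eligible
```

In the real tool, paths matched by `.gitignore` rules are filtered out in the same spirit before the prompt is assembled.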
Stop manually copying files and formatting code for LLMs. Code2Prompt handles the tedious work so you can focus on getting insights and solutions from AI models.
Alternative Installation
Refer to the documentation for detailed installation instructions.
Binary releases
Download the latest binary for your OS from Releases.
Source build
Requires:
Star History
License
Licensed under the MIT License, see LICENSE for more information.
Liked the project?
If you liked the project and found it useful, please give it a :star:!
Contribution
Ways to contribute:
- Suggest a feature
- Report a bug
- Fix something and open a pull request
- Help me document the code
- Spread the word