The server provides tools for managing and querying project context in LLM-assisted development, with three primary functions:

- Generate Repository Overview: Creates a structured overview with directory trees, file statuses, key file contents, and smart outlines for supported languages
- Retrieve File Contents: Fetches the complete contents of specific files for analysis or text searches
- List Modified Files: Tracks files changed since a given timestamp, helping identify updates during conversations

The server also supports customization through profiles, which define file-inclusion rules and presentation formats for tailored output, and it manages context to avoid redundant requests.
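To illustrate the third function, a "files modified since a timestamp" check can be sketched as a filesystem walk comparing modification times against a cutoff. This is a conceptual sketch, not the server's actual implementation; the function name is illustrative:

```python
from pathlib import Path

def files_modified_since(root: str, since_epoch: float) -> list[str]:
    """Return paths (relative to root) of files modified after the cutoff."""
    return sorted(
        str(p.relative_to(root))
        for p in Path(root).rglob("*")
        if p.is_file() and p.stat().st_mtime > since_epoch
    )
```

The real tool additionally respects the active rule's inclusion filters, so ignored files never appear in the report.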
LLM Context
Reduce friction when providing context to LLMs. Share relevant project files instantly through smart selection and rule-based filtering.
The Problem
Getting project context into LLM chats is tedious:
- Manually copying/pasting files takes forever
- Hard to identify which files are relevant
- Including too much hits context limits; too little misses important details
- AI requests for additional files require manual fetching
- Repeating this process for every conversation
Related MCP server: Bifrost VSCode Devtools
The Solution
Result: From "I need to share my project" to productive AI collaboration in seconds.
Note: This project was developed in collaboration with several Claude Sonnets (3.5, 3.6, 3.7 and 4.0), as well as Groks (3 and 4), using LLM Context itself to share code during development. All code in the repository is heavily human-curated (by me 😇, @restlessronin).
Installation
Quick Start
Basic Usage
MCP Integration (Recommended)
With MCP, AI can access additional files directly during conversations.
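For Claude Desktop, the server is typically registered in `claude_desktop_config.json`. A minimal sketch, assuming `uvx` is available and that `lc-mcp` is the server entry point (check the user guide for the exact invocation for your version):

```json
{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```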
Project Customization
Deployment Patterns
Choose based on your LLM environment:
- System Message: `lc-context -p` (AI Studio, etc.)
- Single User Message: `lc-context -p -m` (Grok, etc.)
- Separate Messages: `lc-prompt` + `lc-context -m`
- Project/Files (included): `lc-context` (Claude Projects, etc.)
- Project/Files (searchable): `lc-context -m` (force into context)
See Deployment Patterns in the user guide for details.
Core Commands
| Command | Purpose |
|---------|---------|
| `lc-init` | Initialize project configuration |
| `lc-select` | Select files based on current rule |
| `lc-context` | Generate and copy context |
| `lc-context -p` | Generate context with prompt |
| `lc-context -m` | Send context as separate message |
| `lc-context -nt` | No tools (for Project/Files inclusion) |
| `lc-context -f FILE` | Write context to file |
| `lc-set-rule <n>` | Switch between rules |
| `lc-missing` | Handle file and context requests (non-MCP) |
Rule System
Rules use a systematic five-category structure:
- Prompt Rules (`prm`): Generate project contexts (e.g., `lc/prm-developer`, `lc/prm-rule-create`)
- Filter Rules (`flt`): Control file inclusion (e.g., `lc/flt-base`, `lc/flt-no-files`)
- Instruction Rules (`ins`): Provide guidelines (e.g., `lc/ins-developer`, `lc/ins-rule-framework`)
- Style Rules (`sty`): Enforce coding standards (e.g., `lc/sty-python`, `lc/sty-code`)
- Excerpt Rules (`exc`): Configure extractions for context reduction (e.g., `lc/exc-base`)
Example Rule
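The original example is not reproduced here, but a hypothetical prompt rule file might look like the following. Rules are markdown files with YAML frontmatter; the field names (`description`, `compose`, `filters`) are assumptions based on the five-category system above — consult the user guide for the actual schema:

```markdown
---
description: Focus on the authentication subsystem
compose:
  filters: [lc/flt-base]
---

Review the authentication flow and suggest improvements.
```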
AI-Assisted Rule Creation
Let AI create focused rules for specific tasks. There are two approaches depending on your setup:
Approach 1: Claude Skill (Recommended for Claude Desktop/Code)
How it works: A global Claude Skill helps you create rules interactively. It requires project context (with overview) already shared via llm-context, and uses lc-missing to examine specific files as needed.
Setup:
Workflow:
Claude will:

- Use the project overview already in context
- Use `lc-missing` to examine specific files as needed for deeper analysis
- Ask clarifying questions about scope and focus
- Intelligently select relevant files (5-15 full, 10-30 excerpted)
- Generate an optimized rule configuration
- Save the rule to `.llm-context/rules/tmp-prm-<task-name>.md`
- Provide instructions for testing the rule
Skill Files:
- `SKILL.md` - Quick workflow and patterns (always loaded)
- `PATTERNS.md` - Common rule patterns (on demand)
- `SYNTAX.md` - Detailed syntax reference (on demand)
- `EXAMPLES.md` - Complete walkthroughs (on demand)
- `TROUBLESHOOTING.md` - Problem solving (on demand)
Skill Updates: Automatically updated when you upgrade llm-context. Restart Claude to use the new version.
Approach 2: Prompt-Based with Instruction Rules (Works Anywhere)
How it works: You use a project rule that loads comprehensive rule-creation documentation as context.
Setup: No special setup needed - the documentation is built-in.
Usage:
Documentation Included:
- `lc/ins-rule-intro` - Chat-based rule creation introduction
- `lc/ins-rule-framework` - Comprehensive decision framework, semantics, and best practices
Comparison
| Aspect | Skill | Instruction Rules |
|--------|-------|-------------------|
| Setup | Automatic with llm-context upgrade | Already available |
| Requires project context | Yes (overview needed) | Yes (overview needed) |
| Interaction | Interactive, multi-turn in Claude | Static documentation in context |
| Exploration | Uses `lc-missing` as needed | Manual or via AI requests |
| Best for | Claude Desktop/Code users | Any LLM, API, automation |
Both approaches require sharing project context first via lc-context. They produce equivalent results; choose based on your environment and preference.
Workflow Patterns
Daily Development
Focused Tasks
MCP Benefits
- Code review: AI examines your changes for completeness/correctness
- Additional files: AI accesses initially excluded files when needed
- Change tracking: See what's been modified during conversations
- Zero friction: No manual file operations during development discussions
Key Features
- Smart File Selection: Rules automatically include/exclude appropriate files
- Instant Context Generation: Formatted context is copied to the clipboard in seconds
- MCP Integration: AI can access additional files without manual intervention
- Systematic Rule Organization: Five-category system for clear rule composition
- AI-Assisted Rule Creation: Two approaches - an interactive Skill or documentation-based instruction rules
- Code Excerpting: Extracts significant content to reduce context size while preserving structure
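The excerpting idea — keep structure, drop bodies — can be sketched for Python source with the standard `ast` module. This is a toy illustration of the concept, not llm-context's actual excerpter:

```python
import ast

def outline(source: str) -> str:
    """Keep top-level function/class signature lines, elide their bodies."""
    lines = source.splitlines()
    kept = []
    for node in ast.parse(source).body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            # node.lineno points at the `def`/`class` line itself
            kept.append(lines[node.lineno - 1].rstrip() + "  # ... body elided")
    return "\n".join(kept)
```

An outline like this preserves the file's shape (what is defined, with what signature) at a fraction of the token cost of full contents.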
Learn More
User Guide - Complete documentation
License
Apache License, Version 2.0. See LICENSE for details.