The server provides tools for managing and querying project context in LLM-assisted development, with three primary functions:

- **Generate Repository Overview**: Creates a structured overview with directory trees, file statuses, key file contents, and smart outlines for supported languages
- **Retrieve File Contents**: Fetches the complete contents of specific files for analysis or text searches
- **List Modified Files**: Tracks files changed since a given timestamp, helping identify updates during conversations

The server also supports customization through profiles, which define file inclusion rules and presentation formats for tailored outputs, while managing context to avoid redundant requests.
LLM Context
Smart context management for LLM development workflows. Share relevant project files instantly through intelligent selection and rule-based filtering.
The Problem
Getting the right context into LLM conversations is friction-heavy:

- Manually finding and copying relevant files wastes time
- Too much context hits token limits; too little misses important details
- AI requests for additional files require manual fetching
- Hard to track what changed during development sessions
The Solution
llm-context provides focused, task-specific project context through composable rules.
For humans using chat interfaces:
For AI agents with CLI access:
For AI agents in chat (MCP tools):
- `lc_outlines` - Generate excerpted context from current rule
- `lc_preview` - Validate rule effectiveness before use
- `lc_missing` - Fetch specific files/implementations on demand
Note: This project was developed in collaboration with several Claude Sonnets (3.5, 3.6, 3.7, 4.0) and Groks (3, 4), using LLM Context itself to share code during development. All code is heavily human-curated by @restlessronin.
Installation
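The installation command itself is not shown here. Assuming the standard `uv` distribution of llm-context (verify the package name and any version constraint against the project's own docs), installation would typically be:

```shell
uv tool install llm-context
```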
Quick Start
Human Workflow (Clipboard)
MCP Integration (Recommended)
Add to Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json):
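The config snippet is not reproduced here. A typical `mcpServers` entry, assuming llm-context is launched via `uvx` and exposes an `lc-mcp` entry point (both are assumptions to verify against the current project docs), might look like:

```json
{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```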
Restart Claude Desktop. Now AI can access additional files during conversations without manual copying.
Agent Workflow (CLI)
AI agents with shell access use llm-context to create focused contexts:
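A minimal sketch of such a session follows; the command names (`lc-init`, `lc-select`, `lc-context`) are assumptions inferred from the command purposes described below, so verify them against your installed version:

```shell
# Initialize project configuration (once per repository)
lc-init
# Select files according to the currently active rule
lc-select
# Generate the context for pasting or piping to the LLM
lc-context
```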
Agent Workflow (MCP)
AI agents in chat environments use MCP tools:
Core Concepts
Rules: Task-Specific Context Descriptors
Rules are YAML+Markdown files that describe what context to provide for a task:
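The anatomy of such a file might look like the sketch below; the file name and the frontmatter keys (`description`, `compose`) are illustrative, not the authoritative schema:

```yaml
---
# hypothetical rule file: tmp-prm-auth.md
description: Context for working on the authentication module
compose:
  filters: [lc/flt-base]   # reuse a stock filter rule
---
Focus on the authentication flow; ignore unrelated subsystems.
```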
Five Rule Categories
- **Prompt Rules** (`prm-`): Generate project contexts (e.g., `lc/prm-developer`)
- **Filter Rules** (`flt-`): Control file inclusion (e.g., `lc/flt-base`, `lc/flt-no-files`)
- **Instruction Rules** (`ins-`): Provide guidelines (e.g., `lc/ins-developer`)
- **Style Rules** (`sty-`): Enforce coding standards (e.g., `lc/sty-python`)
- **Excerpt Rules** (`exc-`): Configure content extraction (e.g., `lc/exc-base`)
Rule Composition
Build complex rules from simpler ones:
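For instance, a task rule could pull in the stock rules named above (`lc/flt-base`, `lc/sty-python`); the `compose` keys in this sketch are illustrative, not the exact schema:

```yaml
---
description: Python development context with style guidance
compose:
  filters: [lc/flt-base]    # file-inclusion rules
  styles: [lc/sty-python]   # coding-standard guidance
---
```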
Essential Commands
| Command | Purpose |
|---------|---------|
| | Initialize project configuration |
| | Select files based on current rule |
| | Generate and copy context |
| | Include prompt instructions |
| | Format as separate message |
| | No tools (manual workflow) |
| | Switch active rule |
| | Validate rule selection and size |
| | Get code structure excerpts |
| | Fetch files/implementations (manual MCP) |
AI-Assisted Rule Creation
Let AI help create focused, task-specific rules. Two approaches depending on your environment:
Claude Skill (Interactive, Claude Desktop/Code)
How it works: Global skill guides you through creating rules interactively. Examines your codebase as needed using MCP tools.
Setup:
Usage:
Claude will:
- Use the project overview already in context
- Examine specific files via `lc-missing` as needed
- Ask clarifying questions about scope
- Generate an optimized rule (`tmp-prm-<task>.md`)
- Provide validation instructions
Skill documentation (progressively disclosed):
- `Skill.md` - Quick workflow, decision patterns
- `PATTERNS.md` - Common rule patterns
- `SYNTAX.md` - Detailed reference
- `EXAMPLES.md` - Complete walkthroughs
- `TROUBLESHOOTING.md` - Problem solving
Instruction Rules (Works Anywhere)
How it works: Load comprehensive rule-creation documentation into context, work with any LLM.
Usage:
Included documentation:
- `lc/ins-rule-intro` - Introduction and overview
- `lc/ins-rule-framework` - Complete decision framework
Comparison
| Aspect | Skill | Instruction Rules |
|--------|-------|-------------------|
| Setup | Automatic with | Already available |
| Interaction | Interactive, uses | Static documentation |
| File examination | Automatic via MCP | Manual or via AI |
| Best for | Claude Desktop/Code | Any LLM, any environment |
| Updates | Automatic with version upgrades | Built-in to rules |
Both require sharing project context first. Both produce equivalent results.
Project Customization
Create Base Filters
Create Development Rule
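Both steps amount to adding rule files to the project. As one hypothetical sketch of a project-specific filter rule (the file name, location, and every key shown are illustrative, not the real schema):

```yaml
---
# hypothetical filter rule excluding generated artifacts
description: Exclude build output and lockfiles
compose:
  filters: [lc/flt-base]
exclude: ["build/**", "*.lock"]
---
```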
Deployment Patterns
Choose format based on your LLM environment:
| Pattern | Command | Use Case |
|---------|---------|----------|
| System Message | | AI Studio, etc. |
| Single User Message | | Grok, etc. |
| Separate Messages | | Flexible placement |
| Project Files (included) | | Claude Projects, etc. |
| Project Files (searchable) | | Force into context |
See Deployment Patterns for details.
Key Features
- **Intelligent Selection**: Rules automatically include/exclude appropriate files
- **Context Validation**: Preview size and selection before generation
- **Code Excerpting**: Extract structure while reducing tokens (15+ languages)
- **MCP Integration**: AI accesses additional files without manual intervention
- **Composable Rules**: Build complex contexts from reusable patterns
- **AI-Assisted Creation**: Interactive skill or documentation-based approaches
- **Agent-Friendly**: CLI and MCP interfaces for autonomous operation
Common Workflows
Daily Development (Human)
Focused Task (Human or Agent)
Agent Context Provisioning (CLI)
Agent Context Provisioning (MCP)
Path Format
All paths use project-relative format with project name prefix:
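For example (the project name `myproject` and the paths are hypothetical):

```
/myproject/src/auth/login.py
/myproject/tests/test_login.py
```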
This enables multi-project context composition without path conflicts.
In rules, patterns are project-relative without the prefix:
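For example, hypothetical glob patterns selecting an auth module and its tests:

```
src/auth/**
tests/test_auth*.py
```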
Learn More
User Guide - Complete documentation with examples
Design Philosophy - Why llm-context exists
Real-world Examples - Using full context effectively
License
Apache License, Version 2.0. See LICENSE for details.