Glama

llm-context

by cyberchitta

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

No arguments

Schema

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources

Tools

Functions exposed to the LLM to take actions

Name | Description
lc-project-context

⛔️ DO NOT USE this tool when you already have the project context. First check if project context is already available in the conversation before making any new requests. Use lc-get-files for retrieving specific files, and only use this tool when a broad repository overview is needed.

Generates a structured repository overview including:
1) Directory tree with file status (✓ full, ○ outline, ✗ excluded)
2) Complete contents of key files
3) Smart outlines highlighting important definitions in supported languages
The output is customizable via profiles that control file inclusion rules and presentation format. The assistant tracks previously retrieved project context in the conversation and checks this history before making new requests.

lc-get-files

⚠️ ALWAYS SEARCH THE ENTIRE CONVERSATION CONTEXT FIRST! DO NOT request files that have already been provided. Retrieves (read-only) complete contents of specified files from the project. Requires the context generation timestamp to check against existing file selections. Files already included with full content will return a message instead of duplicate content. Files included as outlines will be upgraded to full content.

lc-list-modified-files

IMPORTANT: First get the generation timestamp from the project context. Returns a list of paths to files that have been modified since a given timestamp. This is typically used to track which files have changed during the conversation. After getting the list, use lc-get-files to examine the contents of any modified files of interest.

lc-code-outlines

Returns smart outlines highlighting important definitions in all supported code files. Requires the context generation timestamp to check against existing selections. If outlines are already included in the current context, returns a message instead of duplicate content. This provides a high-level overview of code structure without retrieving full file contents. Use lc-get-implementations to retrieve the full implementation of any definition shown in these outlines.

lc-get-implementations

Retrieves complete code implementations of definitions identified in code outlines. Provide a list of file paths and definition names to get their full implementations. This tool works with all supported languages except C and C++.

lc-create-rule-instructions

Call this tool when asked to create a focused rule, minimize context, or generate context for a specific task. Provides step-by-step instructions for creating custom rules that include only the minimum necessary files for a given objective. Use whenever someone requests focused context, targeted rules, or context reduction for a particular purpose.
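As a sketch, the workflow the tools above describe (check the generation timestamp, list modified files, then fetch the interesting ones) would be driven through MCP's standard JSON-RPC `tools/call` method. Note that the argument names used here (`timestamp`, `paths`) are illustrative assumptions, not the server's documented schema:

```python
import json


def tools_call(request_id, name, arguments):
    """Build an MCP JSON-RPC 2.0 'tools/call' request payload."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }


# 1) Ask which files changed since the context was generated.
#    The "timestamp" argument name is a hypothetical placeholder.
list_req = tools_call(
    1, "lc-list-modified-files", {"timestamp": "2024-01-01T00:00:00Z"}
)

# 2) Fetch full contents of any modified files of interest.
#    The "paths" argument name is likewise hypothetical.
get_req = tools_call(
    2,
    "lc-get-files",
    {"paths": ["src/main.py"], "timestamp": "2024-01-01T00:00:00Z"},
)

print(json.dumps(list_req, indent=2))
```

A real client would send these payloads over the MCP transport (stdio or HTTP) and, per the descriptions above, would first search the conversation context to avoid re-requesting files already provided.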

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py'
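The same lookup can be done from Python with only the standard library; this minimal sketch fetches the directory entry and decodes the JSON body without assuming any particular response fields:

```python
import json
import urllib.request

API_URL = "https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py"


def fetch_server_info(url=API_URL):
    """GET the MCP directory entry and decode its JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


# Usage (requires network access):
# info = fetch_server_info()
# print(sorted(info.keys()))
```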

If you have feedback or need assistance with the MCP directory API, please join our Discord server.