lc_outlines

Extract key sections from files to provide structured overviews for LLM context sharing, using specified selection rules and directory paths.

Instructions

Returns excerpted content highlighting important sections in all supported files.

Args:
    root_path: Root directory path

Input Schema

Name       Required  Description          Default
root_path  Yes       Root directory path  —
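As a sketch of how a client invokes this tool, the MCP protocol wraps tool calls in a JSON-RPC 2.0 `tools/call` request with the tool name and its arguments. The envelope below is illustrative; the `id` and `root_path` values are placeholders:

```python
import json

# Illustrative JSON-RPC 2.0 payload for invoking lc_outlines over MCP.
# "tools/call" with {name, arguments} params follows the MCP tool-call shape;
# the id and root_path values here are placeholders, not from the docs.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "lc_outlines",
        "arguments": {"root_path": "/path/to/project"},
    },
}

payload = json.dumps(request)
```

The server responds with the outline text produced by the handler shown below.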

Implementation Reference

  • The MCP tool handler for 'lc_outlines', decorated with @mcp.tool() for automatic registration. It creates an ExecutionEnvironment from the root_path and calls the helper commands.get_outlines(env).
    @mcp.tool()
    def lc_outlines(root_path: str) -> str:
        """Returns excerpted content highlighting important sections
        in all supported files.

        Args:
            root_path: Root directory path
        """
        env = ExecutionEnvironment.create(Path(root_path))
        with env.activate():
            return commands.get_outlines(env)
  • Core helper function implementing the outline generation logic. It configures settings for outlines, selects only excerpted files, and invokes ContextGenerator.outlines() to produce the result.
    def get_outlines(env: ExecutionEnvironment) -> str:
        settings = ContextSettings.create(False, False, False, False)
        selector = ContextSelector.create(env.config)
        file_sel_excerpted = selector.select_excerpted_only(env.state.file_selection)
        return ContextGenerator.create(env.config, file_sel_excerpted, settings, env.tagger).outlines()
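The actual excerpting is delegated to ContextGenerator and the project's tagger. As a loose, hypothetical illustration of what "excerpted content highlighting important sections" can mean (this is not the algorithm llm-context uses), here is a minimal sketch that keeps top-level `def`/`class` declaration lines from Python source and elides the rest:

```python
import re

def sketch_outline(source: str) -> str:
    """Hypothetical stand-in for outline generation: keep only
    top-level def/class declaration lines, marking elided spans."""
    kept = []
    for line in source.splitlines():
        if re.match(r"(def|class)\s+\w+", line):
            kept.append(line)
        elif kept and kept[-1] != "...":
            kept.append("...")  # marker for elided content
    return "\n".join(kept)

sample = "class Foo:\n    def bar(self):\n        pass\n\ndef baz():\n    return 1\n"
print(sketch_outline(sample))
```

The real implementation is language-aware rather than regex-based, but the output shape is similar: declaration lines interleaved with elision markers.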

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py'
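The same request can be made from Python. The sketch below builds the GET request with the standard library; only the URL is taken from the docs above, and the commented-out line shows where the request would actually be sent:

```python
from urllib.request import Request, urlopen  # urlopen shown for reference only

# Python equivalent of the curl command above. The URL comes from the docs;
# the response body is whatever JSON the directory API returns (not shown here).
url = "https://glama.ai/api/mcp/v1/servers/cyberchitta/llm-context.py"
req = Request(url, method="GET")
# data = urlopen(req).read()  # uncomment to perform the request
```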

If you have feedback or need assistance with the MCP directory API, please join our Discord server.