
ContextBuilder (ctx)

by koorosh-alt

Server Configuration

Environment variables used to configure the server.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| OPENAI_API_KEY | No | OpenAI API key for LLM-powered context structuring and extraction. | |
| ANTHROPIC_API_KEY | No | Anthropic API key for LLM-powered context structuring and extraction. | |

Capabilities

Features and capabilities supported by this server

| Capability | Details |
|------------|---------|
| tools | `{ "listChanged": true }` |
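The `listChanged: true` flag means the server may emit a `notifications/tools/list_changed` notification when its tool set changes, so clients should re-enumerate tools on receiving it. A sketch of the standard MCP JSON-RPC request a client sends to list this server's tools:

```python
import json

# JSON-RPC 2.0 request an MCP client sends to enumerate the server's tools.
# Because the server advertises "listChanged": true, the client should also
# listen for "notifications/tools/list_changed" and re-issue this request
# when that notification arrives.
tools_list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

wire_message = json.dumps(tools_list_request)
```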

Tools

Functions exposed to the LLM to take actions

| Name | Description |
|------|-------------|
| ctx.refresh.app_sources | Refresh and rebuild app context state and graph for a given app. Fetches public web sources, extracts structured content, and builds the context graph. |
| ctx.push.starter_context | Push a compact starter context bundle for an app. Returns a product overview, key concepts, constraints, and a navigation map for deeper slices. |
| ctx.pull.context_slice | Pull a targeted context slice for an app based on intent, budget, and required sections. Uses graph-based retrieval and observation masking. |
| ctx.get.app_state_summary | Get a high-level summary of app context state, including refresh status, available sections, and graph stats. |
| ctx.get.provenance | Get provenance details for a delivered context bundle or slice. Returns source snapshot references and statement-level provenance. |
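A client invokes any of these tools through the standard MCP `tools/call` method. The sketch below targets `ctx.pull.context_slice`; the argument names (`app`, `intent`, `token_budget`, `sections`) are inferred from the tool's description above, not from a published input schema, so treat them as placeholders:

```python
import json

# Hypothetical tools/call request for ctx.pull.context_slice. The method
# name "tools/call" and the params envelope are standard MCP; the argument
# names inside "arguments" are illustrative assumptions.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "ctx.pull.context_slice",
        "arguments": {
            "app": "example-app",                      # assumed parameter
            "intent": "explain how billing works",     # assumed parameter
            "token_budget": 2000,                      # assumed parameter
            "sections": ["constraints", "key_concepts"],  # assumed parameter
        },
    },
}

payload = json.dumps(call_request)
```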

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/koorosh-alt/hengam-context-builder'
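The same GET request can be issued from Python. The sketch below only constructs the request; actually sending it (e.g. via `urllib.request.urlopen`) requires network access, and the shape of the JSON response is not documented here:

```python
from urllib.request import Request

# Build the same GET request as the curl example above.
url = "https://glama.ai/api/mcp/v1/servers/koorosh-alt/hengam-context-builder"
req = Request(url, method="GET")
```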

If you have feedback or need assistance with the MCP directory API, please join our Discord server.