ContextBuilder (ctx)

by koorosh-alt

Context-as-a-Service MCP server for Hengam's multi-agent system. Maintains app-isolated, structured, provenance-backed context for Shopify apps and delivers "just enough context" to other agents.

Features

  • App-scoped context: Isolated context for 4 Shopify apps (Notify Me!, Subi, Discounty, Convi)

  • Agentic Graph Memory: Graph-based retrieval with multi-hop traversal, not just vector search

  • Hybrid delivery: Push starter context bundles + Pull targeted context slices

  • Observation Masking: Budget-aware compression with full transparency on what was included/excluded

  • Provenance tracking: Every statement traceable to source URL + snapshot timestamp + content hash

  • Configurable LLM: Provider-agnostic (OpenAI, Anthropic, Gemini) with editable prompt templates

  • Schema-validated: All data objects validated with Zod at boundaries
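The provenance model above can be sketched in a few lines of TypeScript. Field names (`sourceUrl`, `snapshotAt`, `contentHash`) are illustrative assumptions, not the server's actual schema, and while the project validates with Zod, this sketch hand-rolls the hash check for self-containedness:

```typescript
import { createHash } from "node:crypto";

// Hypothetical provenance record: every statement points back to a
// source URL, a snapshot timestamp, and a hash of the snapshot body.
interface Provenance {
  sourceUrl: string;
  snapshotAt: string;   // ISO-8601 timestamp of the crawl
  contentHash: string;  // sha256 of the snapshot content
}

// Compute the content hash for a fetched snapshot body.
function hashSnapshot(body: string): string {
  return createHash("sha256").update(body, "utf8").digest("hex");
}

// Verify that a statement's provenance still matches a snapshot body.
function verifyProvenance(p: Provenance, snapshotBody: string): boolean {
  return p.contentHash === hashSnapshot(snapshotBody);
}

const body = "<html>Notify Me! restock alerts</html>";
const prov: Provenance = {
  sourceUrl: "https://apps.shopify.com/notify-me",
  snapshotAt: "2024-01-01T00:00:00Z",
  contentHash: hashSnapshot(body),
};
console.log(verifyProvenance(prov, body)); // true
```

Hashing the snapshot rather than the live page means a statement can be re-verified later even if the source URL changes.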

Quick Start

# Install dependencies
pnpm install

# Set up LLM API key (at least one required for refresh)
export OPENAI_API_KEY=sk-...
# or
export ANTHROPIC_API_KEY=sk-ant-...

# Run the MCP server
pnpm dev

# Run tests
pnpm test

MCP Tools

| Tool | Description |
| --- | --- |
| ctx.refresh.app_sources | Refresh and rebuild context for an app |
| ctx.push.starter_context | Push compact starter context bundle |
| ctx.pull.context_slice | Pull targeted context slice by intent |
| ctx.get.app_state_summary | Get app state summary + refresh status |
| ctx.get.provenance | Get provenance for a bundle/slice |
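As a rough illustration of the pull side, a `ctx.pull.context_slice` call might take a payload like the one below. The parameter names (`appId`, `intent`, `tokenBudget`) are guesses for illustration, not the server's actual Zod-validated schema:

```typescript
// Hypothetical request shape for ctx.pull.context_slice -- names
// are illustrative only, not the real tool schema.
interface PullContextSliceRequest {
  appId: string;        // which of the four apps to pull from
  intent: string;       // what the calling agent is trying to do
  tokenBudget?: number; // optional cap on slice size
}

function buildPullRequest(
  appId: string,
  intent: string,
  tokenBudget?: number,
): PullContextSliceRequest {
  return { appId, intent, ...(tokenBudget !== undefined ? { tokenBudget } : {}) };
}

const req = buildPullRequest("notify-me", "explain restock alert setup", 800);
console.log(JSON.stringify(req));
```

The intent string lets the graph layer choose which nodes and edges to traverse, rather than forcing the caller to know the graph's structure.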

Architecture

Ingestion → Extraction → Graph → Delivery
  fetch       summarize    build     push/pull
  parse       extract      traverse  mask
  snapshot    score        validate  provenance

Pipeline Flow

  1. Ingestion: Fetch public web sources (listing, website, help center), parse HTML, create snapshots with content hashes

  2. Extraction: LLM-powered structuring — summarize pages, extract concepts/procedures, score observations, detect conflicts

  3. Graph: Build context graph with nodes (features, procedures, constraints, FAQs, entities) and typed edges (explains, depends_on, resolves, etc.)

  4. Delivery: Serve context via push (starter bundles) or pull (targeted slices) with observation masking and provenance
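The observation masking in step 4 can be sketched as a greedy budget fill: observations are ranked by score, included until the budget is exhausted, and the excluded IDs are reported so the caller sees exactly what was masked out. This is a minimal sketch under assumed names (`Observation`, `maskObservations`), not the server's actual implementation:

```typescript
interface Observation {
  id: string;
  text: string;
  score: number; // relevance score assigned during extraction
}

interface MaskedBundle {
  included: Observation[];
  excludedIds: string[]; // transparency on what was masked out
}

// Greedy budget fill: take highest-scored observations while their
// combined length fits the character budget; report the rest as masked.
function maskObservations(obs: Observation[], budget: number): MaskedBundle {
  const ranked = [...obs].sort((a, b) => b.score - a.score);
  const included: Observation[] = [];
  const excludedIds: string[] = [];
  let used = 0;
  for (const o of ranked) {
    if (used + o.text.length <= budget) {
      included.push(o);
      used += o.text.length;
    } else {
      excludedIds.push(o.id);
    }
  }
  return { included, excludedIds };
}

const bundle = maskObservations(
  [
    { id: "a", text: "Restock alerts via email", score: 0.9 },
    { id: "b", text: "Supports SMS notifications for back-in-stock events", score: 0.7 },
    { id: "c", text: "Legacy widget styling notes", score: 0.2 },
  ],
  60,
);
console.log(bundle.included.map((o) => o.id).join(","), "| masked:", bundle.excludedIds.join(","));
```

A real implementation would budget in tokens rather than characters, but the shape of the result, included items plus an explicit excluded list, is the point.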

Configuration

All configuration is in config/:

  • apps.yaml — App source URLs and crawl settings

  • model-profiles.yaml — LLM provider configs (model, temperature, rate limits)

  • settings.yaml — Task bindings, budgets, masking thresholds, graph settings

  • prompt-templates/*.hbs — Handlebars templates for all 8 LLM tasks
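For orientation, a single entry in apps.yaml might look roughly like the fragment below. The key names are hypothetical, since the actual schema isn't shown here:

```yaml
# Hypothetical shape of one entry in config/apps.yaml -- key names
# are illustrative, not the server's actual schema.
apps:
  - id: notify-me
    name: "Notify Me!"
    sources:
      listing: https://apps.shopify.com/notify-me
      website: https://notify-me.io
      help_center: https://help.notify-me.io
    crawl:
      max_pages: 50
      refresh_interval_hours: 24
```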

Supported Apps (MVP)

| App | Listing | Website | Help Center |
| --- | --- | --- | --- |
| Notify Me! | apps.shopify.com | notify-me.io | help.notify-me.io |
| Subi | apps.shopify.com | subi.co | help.subi.co |
| Discounty | apps.shopify.com | discounty.ai | help.discounty.ai |
| Convi | apps.shopify.com | conviapp.com | help.conviapp.com |

Development

pnpm build          # Compile TypeScript
pnpm dev            # Run with tsx (dev mode)
pnpm test           # Run all tests
pnpm test:unit      # Run unit tests only
pnpm test:contract  # Run MCP contract tests
pnpm lint           # Type check

Requirements Coverage

Implements REQ-CTX-1 through REQ-CTX-38 from the ContextBuilder agent repository spec. See CLAUDE.md for architecture details.
