
CTX: Context as Code (CaC) tool

by context-hub

CTX: Your AI Coding Companion

MCP-powered development toolkit that gives AI full access to your codebase

What is CTX?

CTX is a single ~20 MB binary with zero dependencies. No Node.js, no Python, no runtime — just download, connect to your MCP client, and start coding with AI.

Connect it to Claude Desktop, Cursor, Cline, or any MCP-compatible client — and your AI gets direct access to read, write, search, and modify files across your projects.

CTX is designed with Claude Desktop in mind and works best with it. Claude's deep understanding of code combined with CTX's filesystem tools, custom commands, and multi-project support creates a seamless development experience — like having a senior developer who knows your entire codebase sitting right next to you.

Key Features

🛠 MCP Server — AI Develops Directly in Your Project

CTX provides a built-in MCP server with powerful filesystem tools:

  • Read & write files — AI creates, modifies, and analyzes code directly

  • Search across codebase — text and regex search with context lines

  • PHP structure analysis — class hierarchy, interfaces, and dependencies at a glance

  • Directory exploration — smart filtering by patterns, dates, sizes, and content

⚡ Custom Tools — Turn Any Command into an AI Tool

Define project-specific commands that AI can execute through MCP:

```yaml
tools:
  - id: run-tests
    description: "Run project tests with coverage"
    type: run
    commands:
      - cmd: vendor/bin/phpunit
        args: [ "--coverage-html", "logs/coverage" ]

  - id: deploy
    description: "Deploy to staging"
    type: run
    schema:
      type: object
      properties:
        branch:
          type: string
          default: "main"
    commands:
      - cmd: ./deploy.sh
        args: [ "{{branch}}" ]
```

Tests, migrations, linters, deployments — anything your terminal can run, AI can trigger.
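
Following the same schema as the example above, a single-command tool stays minimal. This sketch assumes a PHPStan binary at `vendor/bin/phpstan`; substitute whatever your project's terminal commands are:

```yaml
tools:
  - id: lint
    description: "Run static analysis on the src directory"
    type: run
    commands:
      - cmd: vendor/bin/phpstan   # assumption: linter installed via Composer
        args: [ "analyse", "src" ]
```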

📁 Multi-Project Development

Work across multiple microservices simultaneously. AI sees all your projects and can develop cross-cutting features:

```yaml
projects:
  - name: backend-api
    path: ../backend
    description: "REST API service"

  - name: auth-service
    path: ../auth
    description: "Authentication microservice"

  - name: shared-lib
    path: ../packages/shared
    description: "Shared domain models"
```

Start a session, ask AI to list available projects, and develop features that span multiple services — all in one conversation.

🎯 Smart Context Generation

Define exactly what context your AI needs. CTX collects code from files, git diffs, GitHub repos, URLs, and more — then structures it into clean markdown documents:

```yaml
documents:
  - description: User Authentication System
    outputPath: auth.md
    sources:
      - type: file
        sourcePaths: [ src/Auth ]
        filePattern: "*.php"
      - type: git_diff
        commit: "last-week"
```

📐 Declarative Config with JSON Schema

Everything is configured through context.yaml with full JSON Schema support. Your AI assistant can generate and modify these configs for you — just describe what you need.

```shell
ctx init      # Generate initial config
ctx generate  # Build context documents
ctx server    # Start MCP server
```

Quick Start

Install

Linux / WSL:

```shell
curl -sSL https://raw.githubusercontent.com/context-hub/generator/main/download-latest.sh | sh
```

Windows:

```shell
powershell -c "& ([ScriptBlock]::Create((irm 'https://raw.githubusercontent.com/context-hub/generator/main/download-latest.ps1'))) -AddToPath"
```

Connect to Claude Desktop (or Any MCP Client)

The fastest way — auto-detect OS and configure your MCP client:

```shell
ctx mcp:config
```

Or add manually to your MCP client config:

```json
{
  "mcpServers": {
    "ctx": {
      "command": "ctx",
      "args": [ "server" ]
    }
  }
}
```

For a specific project:

```json
{
  "mcpServers": {
    "ctx": {
      "command": "ctx",
      "args": [ "server", "-c", "/path/to/project" ]
    }
  }
}
```

That's it. Your AI assistant now has full access to your project through MCP.

Optional: Generate Context Documents

If you also want to generate static context files for copy-paste workflows:

```shell
cd your-project
ctx init      # Create context.yaml
ctx generate  # Build markdown contexts
```

How It Works

CTX operates in two modes that complement each other:

MCP Server Mode — AI interacts with your codebase in real-time:

```
AI Assistant ←→ CTX MCP Server ←→ Your Project Files
                       ↕
         Custom Tools (tests, deploy, lint...)
         Multiple Projects
         Context Documents
```

Context Generation Mode — build structured documents for any LLM:

```
context.yaml → Sources → Filters → Modifiers → Markdown Documents
```
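
As a concrete sketch of that pipeline, one document can mix several source types. The `file` and `git_diff` entries follow the schema shown earlier in this README; the `url` source type and its fields are assumptions inferred from the feature list ("files, git diffs, GitHub repos, URLs"), not confirmed syntax:

```yaml
documents:
  - description: Release Notes Context
    outputPath: release.md
    sources:
      - type: file              # local source code
        sourcePaths: [ src ]
        filePattern: "*.php"
      - type: git_diff          # recent changes
        commit: "last-week"
      - type: url               # assumption: remote pages as a source
        urls: [ "https://example.com/changelog" ]
```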

Use Cases

🔧 AI-Powered Development (MCP)

Connect CTX to Claude Desktop or Cursor. Ask your AI to explore the codebase, understand architecture, write new features, run tests, and fix issues — all without leaving the conversation.

🏗 Multi-Service Feature Development

Working on a feature that touches multiple microservices? Register all projects, and AI can read code from one service, understand shared models, and implement changes across the entire stack.

📝 Context for Code Review

Generate context documents with recent git diffs, relevant source files, and architecture overview. Share with reviewers or AI assistants for thorough, informed reviews.

🚀 Onboarding

New team member? Generate a comprehensive project overview — architecture, key interfaces, domain models — in seconds. AI can then answer questions about the codebase with full context.

📚 Documentation Generation

Point CTX at your source code with modifiers like php-signature to extract API surfaces, then let AI generate comprehensive documentation.
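
A hedged sketch of that setup, reusing the `documents` schema shown earlier. The `modifiers` key and its behavior are assumptions based on the modifier name mentioned above:

```yaml
documents:
  - description: Public API Surface
    outputPath: api.md
    sources:
      - type: file
        sourcePaths: [ src ]
        filePattern: "*.php"
        modifiers: [ php-signature ]  # assumption: keeps signatures, drops bodies
```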

Full Documentation

For complete documentation, including all features and configuration options:

https://docs.ctxllm.com

Join Our Community

Join Discord

What you'll find:

  • 💡 Share and discover configurations and workflows

  • 🛠️ Get help with setup and advanced usage

  • 🚀 Showcase your AI development workflows

  • 📢 Be the first to know about new releases


License

This project is licensed under the MIT License.

