ContextEngine MCP Server

A Model Context Protocol (MCP) server that provides an interface to ContextEngine, a comprehensive Documentation Driven Development (DDD) system. It gives AI assistants access to ContextEngine's structured workflows, context management tools, and knowledge graph for documentation-first development, eliminating repetitive context provision and ensuring that generated code follows documented requirements.

🚀 What is ContextEngine?

ContextEngine is an MCP server that implements a comprehensive Documentation Driven Development (DDD) system. It provides structured workflows, context management, and tools that enable AI assistants and humans to collaborate effectively through documentation-first development methodologies.

Core Philosophy

ContextEngine addresses the fundamental challenges of AI-human collaboration in software development:

  • Context Management: Eliminates repetitive context provision by creating persistent documentation repositories

  • Quality Assurance: Ensures code generation follows documented requirements and specifications

  • Scalability: Handles 3x to 30x more information per person than traditional development methods

  • Business Alignment: Maintains clear connection between technical implementation and business value

Features

  • Documentation-Driven Workflows: 9 standardized workflows for different development activities

  • Context Engineering: Dynamic context composition and provision through knowledge graph

  • Structured Documentation: Plan and task documents with hierarchical organization

  • Local Documentation Setup: Automatic creation of organized folder structure and configuration files

  • TypeScript: Full type safety and modern development experience

  • Authentication: Built-in support for API keys and authentication

  • Comprehensive Logging: Structured logging with proper MCP compatibility

🛠️ Installation

Requirements

  • Node.js >= v18.0.0

  • Cursor, Claude Code, VS Code, Windsurf, or another MCP client

Connecting to MCP Clients

"mcp": { "servers": { "context-engine": { "type": "stdio", "command": "npx", "args": ["-y", "context-engine", "--api-key", "YOUR_API_KEY"] } } }

🔨 Available Tools

ContextEngine provides tools for DDD workflow execution:

1. start_context_engine

Starts the ContextEngine system and automatically sets up the local documentation structure. This tool:

  • Initializes the ContextEngine via API call to establish system awareness

  • Creates local documentation structure with organized folders for DDD workflows

  • Sets up configuration files with default settings and workflow definitions

  • Provides comprehensive feedback about both remote and local setup status

The tool creates a .context-engine directory structure:

.context-engine/
├── implementation/       # For completed documentation and implementations
├── requirements/         # For requirements and specifications
└── config/               # For configuration files
    ├── settings.json     # ContextEngine settings
    └── workflows.json    # Workflow configurations

Response Format: Returns a combined status showing both API response and local setup results with clear emoji indicators:

  • 📁 Success: Documentation structure setup completed

  • ⚠️ Warning: Setup completed with issues

  • ❌ Error: Setup failed (but API call succeeded)
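
For illustration only, here is a minimal TypeScript sketch of what the local setup step amounts to, using Node's built-in fs and path modules. The folder names mirror the structure shown above; the helper name and placeholder file contents are assumptions, not the server's actual implementation:

import { mkdirSync, writeFileSync } from "node:fs";
import { join } from "node:path";

// Hypothetical helper: recreate the .context-engine layout described above.
function setupLocalDocs(root: string = process.cwd()): void {
  const base = join(root, ".context-engine");

  // Organized folders for DDD workflows.
  for (const dir of ["implementation", "requirements", "config"]) {
    mkdirSync(join(base, dir), { recursive: true });
  }

  // Configuration files with placeholder defaults (the real defaults come from the server).
  writeFileSync(join(base, "config", "settings.json"), JSON.stringify({}, null, 2));
  writeFileSync(join(base, "config", "workflows.json"), JSON.stringify({ workflows: [] }, null, 2));
}

setupLocalDocs();

In practice the start_context_engine tool performs this setup for you and reports the combined remote/local status described above.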

CLI Arguments

The server accepts the following CLI flags (an example invocation follows the list):

  • --transport <stdio|http> – Transport to use (stdio by default)

  • --port <number> – Port to listen on when using http transport (default 3000)

  • --api-key <key> – API key for authentication (if needed)

  • --server-url <url> – Custom server URL (defaults to https://contextengine.in)
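
For example, to run the server over HTTP on the default port instead of stdio (YOUR_API_KEY is a placeholder):

npx -y context-engine --transport http --port 3000 --api-key YOUR_API_KEY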

📚 Usage

  1. Start ContextEngine: Use the start_context_engine tool to initialize the DDD system and set up the local documentation structure (a programmatic sketch follows this list)

  2. Select Workflow: Choose from 9 available workflows based on your development objective

  3. Execute Workflow: Follow the structured workflow phases to complete your development task

  4. Integrate with AI assistants: Connect to Cursor, VS Code, Claude Code, etc.
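
For illustration, a minimal sketch of invoking start_context_engine programmatically, assuming the official @modelcontextprotocol/sdk TypeScript client (client name and version strings are arbitrary; in normal use your MCP client issues this call for you):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main(): Promise<void> {
  // Spawn the server over stdio, the same way an MCP client would.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "context-engine", "--api-key", "YOUR_API_KEY"],
  });

  const client = new Client({ name: "context-engine-demo", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Initialize the DDD system and the local documentation structure.
  const result = await client.callTool({ name: "start_context_engine", arguments: {} });
  console.log(result);

  await client.close();
}

main().catch(console.error);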

📄 License

MIT
