PRD Creator MCP Server

A specialized Model Context Protocol (MCP) server dedicated to creating Product Requirements Documents. This MCP server enables AI systems connected to MCP clients to generate detailed, well-structured product requirement documents through a standardized protocol interface.


Quick Start

Via NPX (recommended):

npx -y prd-creator-mcp

Via Docker:

docker pull saml1211/prd-creator-mcp
docker run -i --rm saml1211/prd-creator-mcp

Configure Providers:

  • Copy .env.example to .env and set your API keys and preferred models (see the example below).
  • Optionally, update provider credentials at runtime using the update_provider_config MCP tool.
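
For example, a minimal .env for the OpenAI provider might look like the sketch below. OPENAI_API_KEY and OPENAI_MODEL are the variables referenced later in this document; take any other variable names from .env.example rather than this sketch.

# Minimal example .env (see .env.example for the full, authoritative list of variables)
OPENAI_API_KEY=your_openai_api_key
OPENAI_MODEL=your_preferred_openai_model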

Get Help:

npx prd-creator-mcp --help

Features

  • PRD Generator: Create complete PRDs based on product descriptions, user stories, and requirements
  • AI-Driven Generation: Generate high-quality PRDs using multiple AI providers
  • Multi-Provider Support: Choose from OpenAI, Google Gemini, Anthropic Claude, or local models
  • Provider Configuration: Customize provider options for each PRD generation
  • Fallback Mechanism: Gracefully falls back to template-based generation when AI is unavailable
  • PRD Validator: Validate PRD completeness against industry standards and customizable rule sets
  • Template Resources: Access a library of PRD templates for different product types
  • MCP Protocol Support: Implements the Model Context Protocol for seamless integration with MCP clients

Installation

Prerequisites

  • Node.js v16 or higher
  • npm or yarn

Install from source

  1. Clone the repository:
git clone https://github.com/Saml1211/prd-mcp-server.git
cd prd-mcp-server
  2. Install dependencies:
npm install
  3. Build the project:
npm run build
  4. Run locally:
npm start
  5. For development with hot reload:
npm run dev

API Reference

The PRD Creator MCP Server provides the following tools:

generate_prd

Generate a complete PRD document using AI or template-based generation.

Parameters:

  • productName: The name of the product
  • productDescription: Description of the product
  • targetAudience: Description of the target audience
  • coreFeatures: Array of core feature descriptions
  • constraints (optional): Array of constraints or limitations
  • templateName (optional): Template name to use (defaults to "standard")
  • providerId (optional): Specific AI provider to use (openai, anthropic, gemini, local, template)
  • additionalContext (optional): Additional context or instructions for the AI provider
  • providerOptions (optional): Provider-specific options like temperature, maxTokens, etc.

Example:

{ "productName": "TaskMaster Pro", "productDescription": "A task management application that helps users organize and prioritize their work efficiently.", "targetAudience": "Busy professionals and teams who need to manage multiple projects and deadlines.", "coreFeatures": [ "Task creation and management", "Priority setting", "Due date tracking", "Team collaboration" ], "constraints": [ "Must work offline", "Must support mobile and desktop platforms" ], "templateName": "comprehensive", "providerId": "openai", "additionalContext": "Focus on enterprise features and security", "providerOptions": { "temperature": 0.5, "maxTokens": 4000 } }

validate_prd

Validate a PRD document against best practices.

Parameters:

  • prdContent: The PRD content to validate
  • validationRules (optional): Array of validation rule IDs to check

Example:

{ "prdContent": "# My Product\n\n## Introduction\n...", "validationRules": ["has-introduction", "minimum-length"] }

list_validation_rules

List all available validation rules.
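
Example response (illustrative only; the field names below are assumptions, while the rule IDs match those accepted by validate_prd):

[
  { "id": "has-introduction", "description": "The PRD must contain an introduction section" },
  { "id": "minimum-length", "description": "The PRD must meet a minimum length" }
]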

list_ai_providers

List all available AI providers and their availability status.

Example response:

[ { "id": "openai", "name": "OpenAI", "available": true }, { "id": "anthropic", "name": "Anthropic Claude", "available": false }, { "id": "gemini", "name": "Google Gemini", "available": false }, { "id": "local", "name": "Local Model", "available": false }, { "id": "template", "name": "Template-based (No AI)", "available": true } ]

Template Management

The server provides additional tools for template management:

  • create_template: Create a new PRD template
  • list_templates: List all available templates
  • get_template: Get a specific template
  • update_template: Update an existing template
  • delete_template: Delete a template
  • export_templates: Export templates to JSON
  • import_templates: Import templates from JSON
  • render_template: Render a template with placeholder values (see the sketch after this list)
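
As an illustration, a render_template call might look like the sketch below. The parameter names (templateName, placeholders) are assumptions; consult the tool's input schema through your MCP client for the exact shape.

{
  "templateName": "standard",
  "placeholders": {
    "productName": "TaskMaster Pro",
    "targetAudience": "Busy professionals and teams"
  }
}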

System Management

  • get_provider_config: Get current provider configuration
  • update_provider_config: Update provider configuration (see the example after this list)
  • health_check: Check system health and provider availability
  • get_logs: Get recent system logs
  • stats: Get usage statistics
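
For example, rotating an OpenAI key at runtime via update_provider_config might look like the sketch below. The parameter names here are assumptions; check the tool's input schema through your MCP client.

{
  "providerId": "openai",
  "apiKey": "your_new_openai_api_key"
}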

Provider Configuration & Hot Reload

Configuring AI Providers

You can configure provider credentials and models in two ways:

  • .env file: Place a .env file in your project or working directory, using .env.example as a template. All standard AI provider variables (e.g., OPENAI_API_KEY, OPENAI_MODEL) are supported.
  • Live protocol tools: Update provider configuration at runtime using the update_provider_config tool via your MCP client. These changes are persisted and take effect immediately—no server restart required.

The server will always merge persistent config (from protocol tools) with environment variables, giving precedence to protocol/tool updates.
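
Conceptually, the precedence behaves like the TypeScript sketch below. This is an illustration of the rule just described, not the server's actual code, and the names are assumptions.

// Illustration only: a value set through update_provider_config (persisted config)
// overrides the corresponding environment variable; env vars fill in the gaps.
interface ProviderConfig {
  apiKey?: string;
  model?: string;
}

function effectiveOpenAIConfig(persisted: ProviderConfig): ProviderConfig {
  return {
    apiKey: persisted.apiKey ?? process.env.OPENAI_API_KEY,
    model: persisted.model ?? process.env.OPENAI_MODEL,
  };
}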

Hot Reload & Automation

When you update provider settings using either method, changes take effect instantly for all new requests. This enables:

  • Seamless automation and scripting via MCP tool interfaces
  • Hassle-free credential rotation and model switching
  • Dynamic environment support for CI/CD and cloud deployments

Integrations

Claude Desktop

Add to claude_desktop_config.json:

{ "mcpServers": { "prd-creator": { "command": "npx", "args": ["-y", "prd-creator-mcp"] } } }

Glama.ai

Available at: https://glama.ai/mcp/servers/@Saml1211/PRD-MCP-Server

Cursor

Add to your Cursor MCP client configuration:

{ "mcpServers": { "prd-creator": { "command": "npx", "args": ["-y", "prd-creator-mcp"] } } }

Roo Code

Add to .roo/mcp.json:

{ "mcpServers": { "prd-creator-mcp": { "command": "npx", "args": ["-y", "prd-creator-mcp"] } } }

Cline

Reference prd-creator-mcp in your MCP workflow definitions.

CLI Usage

Install Globally (optional)

You may also install the MCP server globally to expose the CLI:

npm install -g prd-creator-mcp

Then run:

prd-creator-mcp

Command Reference

  • prd-creator-mcp: Runs the MCP server over the STDIO transport. Use it directly via npx or as a globally installed CLI to integrate with MCP clients and tools.

Uninstall

To remove the global CLI:

npm uninstall -g prd-creator-mcp

CLI Options

View available command line options:

npx prd-creator-mcp --help

Docker

Building the Docker image

docker build -t prd-creator-mcp .

Running with Docker

docker run -i --rm prd-creator-mcp

With environment variables

docker run -i --rm -e OPENAI_API_KEY=your_key_here prd-creator-mcp
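
If you have configured several provider variables, you can pass your entire .env file instead of individual -e flags:

docker run -i --rm --env-file .env prd-creator-mcp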

Contributing

Please read CONTRIBUTING.md and CODE_OF_CONDUCT.md before submitting issues or pull requests.

Changelog

All notable changes to this project are documented in CHANGELOG.md.
