PRD Creator MCP Server
A specialized Model Context Protocol (MCP) server dedicated to creating Product Requirements Documents. This MCP server enables AI systems connected to MCP clients to generate detailed, well-structured product requirement documents through a standardized protocol interface.
- Quick Start
- Features
- Installation
- API Reference
- Provider Configuration
- Integrations
- CLI Usage
- Docker
- Contributing
- Changelog
- Appendix
Quick Start
Via NPX (recommended):
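For example (`prd-creator-mcp` is the published npm package referenced in the CLI section below):

```shell
npx -y prd-creator-mcp
```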
Via Docker:
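A sketch assuming you build the image locally under the name `prd-creator-mcp` (see the Docker section; the exact image name is an assumption):

```shell
docker build -t prd-creator-mcp .
docker run -i --rm prd-creator-mcp
```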
Configure Providers:
- Copy `.env.example` to `.env` and set your API keys and preferred models.
- Optionally, update provider credentials at runtime using the `update_provider_config` MCP tool.
Get Help:
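Assuming the CLI supports a standard `--help` flag (see CLI Options below):

```shell
npx prd-creator-mcp --help
```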
Features
- PRD Generator: Create complete PRDs based on product descriptions, user stories, and requirements
- AI-Driven Generation: Generate high-quality PRDs using multiple AI providers
- Multi-Provider Support: Choose from OpenAI, Google Gemini, Anthropic Claude, or local models
- Provider Configuration: Customize provider options for each PRD generation
- Fallback Mechanism: Gracefully falls back to template-based generation when AI is unavailable
- PRD Validator: Validate PRD completeness against industry standards and customizable rule sets
- Template Resources: Access a library of PRD templates for different product types
- MCP Protocol Support: Implements the Model Context Protocol for seamless integration with MCP clients
Installation
Prerequisites
- Node.js v16 or higher
- npm or yarn
Install from source
- Clone the repository:
- Install dependencies:
- Build the project:
- Run locally:
- For development with hot reload:
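The steps above can be sketched as follows. The repository URL is taken from the Glama listing in the Appendix, and the npm script names (`build`, `start`, `dev`) are assumptions based on common Node.js conventions:

```shell
git clone https://github.com/Saml1211/PRD-MCP-Server.git
cd PRD-MCP-Server
npm install      # install dependencies
npm run build    # build the project
npm start        # run locally
npm run dev      # development with hot reload (assumed script name)
```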
API Reference
The PRD Creator MCP Server provides the following tools:
generate_prd
Generate a complete PRD document using AI or template-based generation.
Parameters:

- `productName`: The name of the product
- `productDescription`: Description of the product
- `targetAudience`: Description of the target audience
- `coreFeatures`: Array of core feature descriptions
- `constraints` (optional): Array of constraints or limitations
- `templateName` (optional): Template name to use (defaults to "standard")
- `providerId` (optional): Specific AI provider to use (openai, anthropic, gemini, local, template)
- `additionalContext` (optional): Additional context or instructions for the AI provider
- `providerOptions` (optional): Provider-specific options such as temperature, maxTokens, etc.
Example:
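An illustrative tool-call payload (all values are examples; the parameter names follow the list above):

```json
{
  "productName": "TaskMaster",
  "productDescription": "A task management app for small distributed teams",
  "targetAudience": "Team leads and project managers at startups",
  "coreFeatures": ["Task assignment", "Due-date reminders", "Progress dashboards"],
  "constraints": ["Must work offline"],
  "templateName": "standard",
  "providerId": "openai",
  "providerOptions": { "temperature": 0.7, "maxTokens": 4000 }
}
```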
validate_prd
Validate a PRD document against best practices.
Parameters:

- `prdContent`: The PRD content to validate
- `validationRules` (optional): Array of validation rule IDs to check
Example:
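An illustrative payload (the rule IDs shown are hypothetical; use `list_validation_rules` to discover the real ones):

```json
{
  "prdContent": "# TaskMaster PRD\n\n## Overview\n...",
  "validationRules": ["rule-1", "rule-2"]
}
```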
list_validation_rules
List all available validation rules.
list_ai_providers
List all available AI providers and their availability status.
Example response:
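A sketch of the response shape (field names and structure are illustrative, not the exact schema):

```json
{
  "providers": [
    { "id": "openai", "available": true },
    { "id": "anthropic", "available": false },
    { "id": "gemini", "available": false },
    { "id": "local", "available": false },
    { "id": "template", "available": true }
  ]
}
```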
Template Management
The server provides additional tools for template management:
- `create_template`: Create a new PRD template
- `list_templates`: List all available templates
- `get_template`: Get a specific template
- `update_template`: Update an existing template
- `delete_template`: Delete a template
- `export_templates`: Export templates to JSON
- `import_templates`: Import templates from JSON
- `render_template`: Render a template with placeholder values
System Management
- `get_provider_config`: Get the current provider configuration
- `update_provider_config`: Update the provider configuration
- `health_check`: Check system health and provider availability
- `get_logs`: Get recent system logs
- `stats`: Get usage statistics
Provider Configuration & Hot Reload
Configuring AI Providers
You can configure provider credentials and models in two ways:
- .env file: Place a `.env` file in your project or working directory, using `.env.example` as a template. All standard AI provider variables (e.g., `OPENAI_API_KEY`, `OPENAI_MODEL`) are supported.
- Live protocol tools: Update provider configuration at runtime using the `update_provider_config` tool via your MCP client. These changes are persisted and take effect immediately; no server restart is required.
The server will always merge persistent config (from protocol tools) with environment variables, giving precedence to protocol/tool updates.
Hot Reload & Automation
When you update provider settings using either method, changes take effect instantly for all new requests. This enables:
- Seamless automation and scripting via MCP tool interfaces
- Hassle-free credential rotation and model switching
- Dynamic environment support for CI/CD and cloud deployments
Integrations
Claude Desktop
Add to `claude_desktop_config.json`:
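A minimal entry, assuming the standard Claude Desktop `mcpServers` layout (the `prd-creator` key is an arbitrary label):

```json
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
```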
Glama.ai
Available at: https://glama.ai/mcp/servers/@Saml1211/PRD-MCP-Server
Cursor
Add to your Cursor MCP client configuration:
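Cursor accepts MCP server entries of the same general shape as Claude Desktop; this is a sketch, not the exact schema:

```json
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
```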
Roo Code
Add to `.roo/mcp.json`:
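A sketch under the same assumptions as the other client configs (server key and schema details may differ):

```json
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
```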
Cline
Reference `prd-creator-mcp` in your MCP workflow definitions.
CLI Usage
Install Globally (optional)
You may also install the MCP server globally to expose the CLI:
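For example:

```shell
npm install -g prd-creator-mcp
```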
Then run:
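```shell
prd-creator-mcp
```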
Command Reference
prd-creator-mcp
Runs the MCP server (STDIO transport). Use directly via npx or as a globally installed CLI for integration with MCP clients and tools.
Uninstall
To remove the global CLI:
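```shell
npm uninstall -g prd-creator-mcp
```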
CLI Options
View available command line options:
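Assuming a standard `--help` flag:

```shell
prd-creator-mcp --help
```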
Docker
Building the Docker image
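A typical build command, run from the repository root (the image name is an assumption):

```shell
docker build -t prd-creator-mcp .
```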
Running with Docker
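Because the server speaks MCP over STDIO, keep stdin open with `-i`:

```shell
docker run -i --rm prd-creator-mcp
```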
With environment variables
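Using the provider variables from `.env.example` (the values shown are placeholders):

```shell
docker run -i --rm \
  -e OPENAI_API_KEY=your-key \
  -e OPENAI_MODEL=your-model \
  prd-creator-mcp
```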
Contributing
Please read CONTRIBUTING.md and CODE_OF_CONDUCT.md before submitting issues or pull requests.
Changelog
All notable changes to this project are documented in CHANGELOG.md.
Appendix
Useful Links
- GitHub Repository
- Model Context Protocol - Official MCP specification
- MCP Inspector - Testing and debugging tool for MCP servers
- NPM Package - Published npm package