Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g. "@Generate Manual Test Cases generate test cases from docs/login.pdf using P1 rules in CSV format".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
MCP Server: Generate Manual Test Cases
This MCP server provides the generate_testcases tool to generate manual test cases from documentation and rules you supply.
Installation
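The installation steps are not shown above; the commands below are a reasonable sketch for a typical Node/TypeScript MCP server (the repository URL is a placeholder and the `build` script is an assumption — check this repo's package.json for the exact script names):

```bash
# Clone the repo and install dependencies (repository URL is illustrative)
git clone <your-repo-url> generate-manual-testcases
cd generate-manual-testcases
npm install

# Compile the TypeScript source so `npm start` can run node dist/index.js
# (assumes a standard "build" script exists)
npm run build
```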
Running the server
- Production: `npm start` (runs `node dist/index.js`)
- Development: `npm run dev` (runs with ts-node)
The server communicates over stdio (stdin/stdout) and is intended to be started by Cursor or another MCP client.
Configuring in Cursor
Add to your MCP config (e.g. Cursor: Settings → MCP or ~/.cursor/mcp.json):
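A minimal sketch of the entry (the server key name is arbitrary, and the relative path assumes Cursor is started from the project root):

```json
{
  "mcpServers": {
    "generate-manual-testcases": {
      "command": "node",
      "args": ["dist/index.js"]
    }
  }
}
```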
Example with a real path:
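The absolute path below is illustrative; point it at your local build of `dist/index.js`:

```json
{
  "mcpServers": {
    "generate-manual-testcases": {
      "command": "node",
      "args": ["/Users/you/projects/generate-manual-testcases/dist/index.js"]
    }
  }
}
```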
After adding this, Cursor will expose the generate_testcases tool from this server.
Tool: generate_testcases
Generates test cases from documentation and rules.
Parameters
| Parameter | Required | Description |
| --- | --- | --- |
| `document_content` | No | Document content (text). Omit if using `document_path`. |
| `document_path` | No | Path to the document file (txt, md, or PDF). Preferred when available. |
| `output_format` | No | Output format: `"markdown"` or `"csv"`. |
| `rules` | Yes | Rules for generating test cases (format, priority, scope, language, etc.). |
| `use_llm` | No | Whether to generate via the client's LLM (sampling); see "How it works". |
| `max_tokens` | No | Max tokens for the LLM response (default 4096). |
Note: At least one of document_content or document_path is required.
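As an illustration, a call that passes the document text inline might use arguments like these (the document text is made up; the parameter names come from the table above and the rules string from the Cursor example below):

```json
{
  "document_content": "The login page accepts email and password and locks the account after 5 failed attempts.",
  "rules": "Table format, English, P1 for happy path",
  "output_format": "csv",
  "max_tokens": 4096
}
```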
How it works
1. Read document: from `document_content`, or by reading the file at `document_path`. PDF files are supported (text is extracted automatically).
2. Combine with rules: build a prompt from the document + rules.
3. Generate test cases (a sketch of this step follows the list):
   - If `use_llm === true` and the client (e.g. Cursor) supports sampling (LLM): the server sends a sampling request to the client so the client's LLM generates the test cases, and returns the result.
   - Otherwise (the client does not support sampling, or `use_llm === false`): the server returns the formatted prompt for you to copy and use with an external LLM.
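A minimal TypeScript sketch of this decision flow; helper names such as `readDocument` and `sampleFromClient` are hypothetical stand-ins, not the server's actual code:

```typescript
import { readFile } from "node:fs/promises";

interface GenerateArgs {
  document_content?: string;
  document_path?: string;
  rules: string;
  use_llm?: boolean;
}

// Read the document text either from the inline argument or from disk.
// (The real server also extracts text from PDFs; a plain read is shown for brevity.)
async function readDocument(args: GenerateArgs): Promise<string> {
  if (args.document_content) return args.document_content;
  if (!args.document_path) throw new Error("document_content or document_path is required");
  return readFile(args.document_path, "utf8");
}

// Decide between client-side sampling and returning the prompt as plain text.
async function handleGenerateTestcases(
  args: GenerateArgs,
  clientSupportsSampling: boolean,
  sampleFromClient: (prompt: string) => Promise<string>, // stand-in for an MCP sampling request
): Promise<string> {
  const doc = await readDocument(args);
  const prompt = `Rules:\n${args.rules}\n\nDocument:\n${doc}\n\nGenerate manual test cases.`;

  if (args.use_llm !== false && clientSupportsSampling) {
    return sampleFromClient(prompt); // the client's LLM produces the test cases
  }
  return prompt; // fallback: hand back the formatted prompt for an external LLM
}
```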
Example rules
"Output test cases as a table: ID, Description, Preconditions, Steps, Expected result, Priority."
"Priority P1 for login/payment flows; P2 for secondary screens."
"Output language: English."
"Each scenario has at most 10 steps; split into multiple scenarios if more complex."
Example usage in Cursor
You can ask the AI in Cursor, for example:
"Use the generate_testcases tool: document_path is
docs/feature-login.md, rules are 'Table format, English, P1 for happy path'."Or paste document content and use
document_contentwithrules.
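The first example above corresponds to tool arguments roughly like the following (parameter names from the table above; the rules string is the one quoted in the example):

```json
{
  "document_path": "docs/feature-login.md",
  "rules": "Table format, English, P1 for happy path"
}
```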
Generate test cases locally (script)
The script reads a document (txt, md, or PDF) and rules, then either calls OpenAI to generate test cases or prints the formatted prompt.
Environment variables:
| Variable | Description |
| --- | --- |
| `DOC_PATH` | Path to the requirement document (default: see the script). |
| `RULES_PATH` | Path to the rules file (default: see the script). |
| `OUTPUT_FORMAT` | Output format, e.g. `csv`, `markdown`, or `both`. |
| | Output file path. For CSV there is a default; see the script. |
| `OPENAI_API_KEY` | If set, the script calls OpenAI to generate test cases. |
Examples:
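For instance (paths are illustrative; the second command reuses the sample paths and `OUTPUT_FORMAT=both` from the "Quick try with samples" section below):

```bash
# Print the formatted prompt only (no OPENAI_API_KEY set)
DOC_PATH=docs/feature-login.md RULES_PATH=rules.txt npm run generate-testcases

# Call OpenAI and write both CSV and Markdown output
OPENAI_API_KEY=sk-... DOC_PATH=samples/doc-example.md RULES_PATH=samples/rules-example.txt \
  OUTPUT_FORMAT=both npm run generate-testcases
```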
CSV columns match the rules format: 模块 (Module), 标题 (Title), 前置条件 (Preconditions), 步骤描述 (Steps), 预期结果 (Expected result), test1测试人员 (test1 tester), test1测试结果 (test1 result), buglink, PRE测试人员 (PRE tester), PRE测试结果 (PRE result), buglink.
Quick try with samples
The samples/ folder contains:
- `doc-example.md`: sample requirement doc (中控后台PC, i.e. the central-control admin backend for PC; sections 6.1–6.4). It references prototype images in `Images/` (image1.png–image4.png).
- `images/` (or `Images/`): prototype screenshots; when present next to the doc, the script and MCP prompt include the image list so generated test cases can reference them (e.g. "参照原型 Images/image1.png", i.e. "refer to prototype Images/image1.png").
- `rules-example.txt`: sample rules (table columns, 模块/module format 代理后台-AGBE/…, language, strict/quality rules).
Generate test cases from doc + images + rules (MCP or script):
- MCP: call the `generate_testcases` tool with `document_path: samples/doc-example.md`, `rules`: the content of `samples/rules-example.txt`, and optionally `output_format: "csv"` or `"markdown"`.
- Script: `DOC_PATH=samples/doc-example.md RULES_PATH=samples/rules-example.txt OUTPUT_FORMAT=both npm run generate-testcases` (set `OPENAI_API_KEY` for LLM generation).
Project structure
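The layout is not listed here; based on the scripts and samples referenced above, it likely looks roughly like this (names such as `src/index.ts` are assumptions):

```text
.
├── src/            # TypeScript source (e.g. src/index.ts, run via ts-node in dev)
├── dist/           # compiled output; dist/index.js is what `npm start` runs
├── samples/
│   ├── doc-example.md
│   ├── rules-example.txt
│   └── images/     # (or Images/) prototype screenshots
└── package.json
```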
License
ISC
Example prompt: "Based on doc-example.md and the images folder, generate test cases for the feature using the MCP server and the rules."