swagger-testcase-mcp
MCP server for API testing: generates test cases, validates specs, compares versions, and creates mock data from Swagger/OpenAPI specifications.
Feed it your API spec URL — get structured, categorized test cases (positive, negative, boundary, auth, security, idempotency, pagination) ready for manual testing or import into your TMS.
Why
Writing test cases manually is slow. A single endpoint with 10+ parameters can require 30-50 test cases across positive, negative, boundary, and security scenarios. Multiply that by dozens of endpoints — and you're spending days on routine work instead of actual testing.
Negative and boundary cases get skipped. Under deadline pressure, QA focuses on happy paths. Edge cases, invalid inputs, and security checks are the first to be cut — and the first to cause production incidents.
Specs change faster than test docs, and internal APIs get left behind. New endpoints, renamed fields, changed constraints — keeping test cases in sync is constant overhead. Corporate APIs behind VPN or custom auth make it even harder, since most tooling doesn't support authenticated spec access.
Getting test cases into your TMS takes extra steps. Even if you write great test cases, formatting them for TestRail, Allure, or Postman import is tedious work that adds no value.
This tool automates the baseline: point it at any Swagger/OpenAPI spec (public, internal, localhost), get categorized test cases in seconds, and save ready-to-import files for your TMS or test runner. You focus on business logic and exploratory testing — the tool handles the rest.
Features
Parses Swagger 2.0 and OpenAPI 3.x — JSON and YAML
Smart test case generation — analyzes parameters, schemas, constraints, response codes, and security to produce relevant test cases
8 test categories — positive, negative, boundary, authorization, business logic, security injection, idempotency, pagination & sorting
9 export formats — Markdown, JSON, CSV, Allure CSV, Gherkin, Postman, k6, pytest, TestRail CSV (two templates: Steps and Text)
Spec validation — quality scoring with 11 rules and actionable suggestions
Spec comparison — diff two versions with breaking change detection
Mock data generation — realistic test data from schemas with locale support
Coverage analysis — identify gaps in test coverage with prioritized recommendations
Auto-save to file — exports are saved to the working directory automatically, no manual copy-paste
Configurable generation — filter by category, priority, locale, custom preconditions
Batch generation — filter by tag, HTTP method, path prefix, deprecated status
Works with any MCP client — Claude Desktop, Cursor, VS Code + Cline, etc.
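To illustrate the mock-data idea from the feature list, here is a hedged sketch (not the tool's actual code): a generator can walk a schema and derive values from keywords like `type`, `enum`, `format`, and length constraints.

```python
import random
import string

def mock_value(schema):
    """Derive a plausible value from a JSON-Schema-like fragment.
    Illustrative only; the real tool handles many more keywords and locales."""
    if "enum" in schema:
        return random.choice(schema["enum"])
    t = schema.get("type", "string")
    if t == "integer":
        return random.randint(schema.get("minimum", 0), schema.get("maximum", 100))
    if t == "boolean":
        return random.choice([True, False])
    if t == "array":
        return [mock_value(schema.get("items", {}))]
    if t == "object":
        return {k: mock_value(v) for k, v in schema.get("properties", {}).items()}
    # string: honor a format hint, else satisfy the minimum length
    if schema.get("format") == "email":
        return "user@example.com"
    n = schema.get("minLength", 8)
    return "".join(random.choices(string.ascii_lowercase, k=n))

pet = mock_value({
    "type": "object",
    "properties": {
        "name": {"type": "string", "minLength": 3},
        "status": {"enum": ["available", "pending", "sold"]},
    },
})
```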
Generated test case categories
| Category | What it covers |
| --- | --- |
| ✅ Positive | Happy path with required fields, happy path with all fields |
| ❌ Negative | Missing required fields, invalid types, empty strings, nulls, invalid enums, empty body |
| 📏 Boundary | Min/max values for numbers, minLength/maxLength for strings |
| 🔐 Auth | No token, invalid token, insufficient permissions |
| 💼 Business Logic | 404 for non-existent resources, 409 conflicts, 429 rate limiting, concurrency |
| 🛡️ Security | XSS, SQL injection, path traversal, command injection, CRLF injection |
| 🔄 Idempotency | Repeated PUT/DELETE, POST with Idempotency-Key |
| 📄 Pagination | First/last page, negative offset, zero limit, invalid sort fields |
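To make the boundary category concrete, here is a sketch of the general technique (not the generator's internals): numeric and length constraints map to values at and just beyond each bound.

```python
def boundary_values(schema):
    """Boundary test candidates for a numeric or string property.
    Sketch of the technique; the real generator covers more constraint types."""
    values = []
    if "minimum" in schema:
        m = schema["minimum"]
        values += [m - 1, m, m + 1]          # just below, at, just above the minimum
    if "maximum" in schema:
        m = schema["maximum"]
        values += [m - 1, m, m + 1]          # just below, at, just above the maximum
    if "minLength" in schema:
        n = schema["minLength"]
        values += ["a" * (n - 1), "a" * n]   # one char too short, exactly minimum
    if "maxLength" in schema:
        n = schema["maxLength"]
        values += ["a" * n, "a" * (n + 1)]   # exactly maximum, one char too long
    return values

cases = boundary_values({"minimum": 1, "maximum": 10})
```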
Installation
```
npm install -g swagger-testcase-mcp
```

Or run directly with npx:

```
npx swagger-testcase-mcp
```

Configuration
Add to claude_desktop_config.json:
```json
{
  "mcpServers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}
```

Add to .cursor/mcp.json:
```json
{
  "mcpServers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}
```

Add to .vscode/mcp.json:
```json
{
  "servers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}
```

Add to Cline MCP settings:
```json
{
  "swagger-testcase-mcp": {
    "command": "npx",
    "args": ["-y", "swagger-testcase-mcp"]
  }
}
```

For JetBrains AI Assistant, go to Settings → Tools → AI Assistant → MCP Servers → Add, select stdio, and fill in:
- Command: `npx`
- Arguments: `-y swagger-testcase-mcp`
Or add manually to the MCP config file:
```json
{
  "servers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}
```

Add to ~/.windsurf/mcp.json:
```json
{
  "mcpServers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}
```

Quick Start
1. "Load the spec from https://petstore3.swagger.io/api/v3/openapi.json"
2. "Generate test cases for POST /pet"
3. "Export as Postman collection"

Here is what the generated output looks like (markdown export):
```markdown
## Positive Tests (1)

### TC-001: POST /pet — happy path with required fields only

- **Priority:** high
- **Preconditions:** Valid authentication credentials are available
- **Steps:**
  1. Send POST request to /pet
  2. Include only required parameters/fields with valid values
  3. Verify response status and body
- **Input:** `{"name":"Buddy","photoUrls":["https://example.com/photo.jpg"]}`
- **Expected Result:** Returns success response with expected data
- **Expected Status:** 200

## Negative Tests (2)

### TC-002: POST /pet — missing required field "name"

- **Priority:** high
- **Preconditions:** Valid authentication credentials are available
- **Steps:**
  1. Send POST request to /pet
  2. Omit required field "name"
  3. Verify error response
- **Input:** `{"_omit":"name"}`
- **Expected Result:** Returns validation error indicating "name" is required
- **Expected Status:** 422

### TC-003: POST /pet — invalid type for "name"

- **Priority:** medium
- **Preconditions:** Valid authentication credentials are available
- **Steps:**
  1. Send POST request to /pet
  2. Set "name" to invalid type value: 12345
  3. Verify error response
- **Input:** `{"name":12345}`
- **Expected Result:** Returns validation error for invalid type of "name"
- **Expected Status:** 422
```

The generator produces test cases across all 8 categories (positive, negative, boundary, auth, business logic, security, idempotency, pagination); the example above shows just a small subset. Use `generate_test_cases_batch` to cover an entire API at once.
For detailed examples of every tool and workflow, see EXAMPLES.md.
Available tools
| Tool | Description |
| --- | --- |
|  | Load and parse a spec from a URL, internal URL, or local file (with optional auth) |
|  | Get detailed info about a specific endpoint |
|  | Generate test cases for one endpoint (with optional config) |
| `generate_test_cases_batch` | Generate test cases for multiple endpoints (filter by tag/method/path prefix) |
|  | Export test cases and save to file (markdown, json, csv, allure_csv, gherkin, postman, k6, pytest, testrail_csv, testrail_csv_text) |
|  | Compare two spec versions and detect breaking changes |
| `validate_spec` | Validate spec quality (score 0-100, 12 rules) |
|  | Generate realistic mock data from schemas |
|  | Analyze test coverage and suggest improvements |
|  | Clear cached specs and/or test cases |
Generation config
All generation tools accept an optional config parameter:
```json
{
  "config": {
    "categories": ["positive", "negative", "security"],
    "priorityFilter": ["high"],
    "maxFieldsForNegative": 15,
    "locale": "ru",
    "customPreconditions": ["Database seeded with test data"],
    "skipDeprecated": true
  }
}
```

| Parameter | Description |
| --- | --- |
| `categories` | Which test categories to generate (e.g. `positive`, `negative`, `security`) |
| `priorityFilter` | Only generate cases with these priorities (e.g. `high`) |
| `maxFieldsForNegative` | Max number of fields for invalid-type negative cases. Default: 10 |
| `locale` | Language for test case text (e.g. `ru`) |
| `customPreconditions` | Additional preconditions added to all generated cases |
| `skipDeprecated` | Skip deprecated endpoints in batch generation |
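The effect of `categories` and `priorityFilter` can be pictured as a simple filter over the generated cases. This is an illustrative sketch, not the tool's implementation:

```python
def apply_config(cases, config):
    """Keep only cases matching the configured categories and priorities.
    Empty/absent filters mean "keep everything" -- mirroring optional config."""
    cats = set(config.get("categories", []))
    prios = set(config.get("priorityFilter", []))
    kept = []
    for case in cases:
        if cats and case["category"] not in cats:
            continue
        if prios and case["priority"] not in prios:
            continue
        kept.append(case)
    return kept

cases = [
    {"id": "TC-001", "category": "positive", "priority": "high"},
    {"id": "TC-002", "category": "negative", "priority": "high"},
    {"id": "TC-003", "category": "negative", "priority": "medium"},
]
kept = apply_config(cases, {"categories": ["negative"], "priorityFilter": ["high"]})
# kept contains only TC-002
```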
Export to file
All exports are automatically saved to the working directory. The filename is generated from the endpoint and format:

```
test-cases_POST__pet_postman.json
test-cases_GET__users_testrail_csv.csv
test-cases_POST__orders_k6.js
```

You can also specify a custom path:
```
"Export as postman to /path/to/my-collection.json"
```

| Format | File extension |
| --- | --- |
| `markdown` | `.md` |
| `json` | `.json` |
| `csv` | `.csv` |
| `allure_csv` | `.csv` |
| `gherkin` | `.feature` |
| `postman` | `.json` |
| `k6` | `.js` |
| `pytest` | `.py` |
| `testrail_csv` | `.csv` |
| `testrail_csv_text` | `.csv` |
Limitations
The tool generates test cases based on what is described in the OpenAPI spec — parameters, schemas, constraints, response codes, and security definitions. It does not cover:
Business logic — rules like "discount cannot exceed 50% for non-admin users" or "order requires at least one item in stock" are not part of the spec and won't produce test cases automatically
Cross-endpoint dependencies — the tool works with each endpoint in isolation. It won't generate chains like "create user → create order → check status"
Custom validation rules — constraints beyond what OpenAPI supports (e.g., conditional field requirements, cross-field validation) are not detected
How to compensate: since the tool runs inside an MCP client (Claude, Cursor, etc.), you can describe your business rules in the chat and ask the LLM to extend or adjust the generated test cases. The tool provides the baseline; the LLM adds the context.
Tip: The better your OpenAPI spec (constraints, enums, examples, descriptions), the more relevant the generated test cases will be. Use `validate_spec` to check your spec quality and get suggestions for improvement.
Supported sources
Works with any Swagger/OpenAPI spec — public, internal corporate network, localhost, or local files.
Public and local specs
```
"Load the spec from https://petstore3.swagger.io/api/v3/openapi.json"
"Load the spec from /path/to/swagger.json"
"Load the spec from ./api-spec.yaml"
"Load the spec from http://localhost:8080/v3/api-docs"
```

Internal corporate APIs
Access specs behind authentication — VPN, SSO, or internal networks:
```
"Load the spec from https://internal-api.company.com/docs/swagger.json with auth_header Bearer eyJhbGci..."
```

Bearer token:

```json
{
  "source": "https://internal-api.company.com/docs/swagger.json",
  "auth_header": "Bearer eyJhbGciOiJIUzI1NiIs..."
}
```

Basic auth:

```json
{
  "source": "https://internal-api.company.com/docs/swagger.json",
  "auth_header": "Basic dXNlcjpwYXNz"
}
```

Custom header:

```json
{
  "source": "https://internal-api.company.com/docs/swagger.json",
  "headers": { "X-API-Key": "abc123" }
}
```

Tip: If your spec is behind a corporate VPN, make sure the MCP server process has network access to the internal URL. The server runs locally, so VPN or proxy settings on your machine apply automatically.
Integration with TMS
Allure TestOps
1. Generate test cases and export as `allure_csv`
2. Import via the Allure TestOps migration tool (configure `fieldMapping` to match the columns below)

Exported columns: Name, Description, Precondition, Steps, Expected Result, Layer, Tags

Alternatively, combine with the Allure TestOps MCP server to create test cases directly via API.
Note: CSV import through the Allure TestOps UI was removed in version 25.3.4. Use the migration tool or API integration instead.
```
"Generate test cases for POST /pet"
"Export as allure_csv"
```

TestRail
Two export formats are available:
- `testrail_csv` — Steps template (one row per step). The expected result is attached to the last step of each test case; intermediate steps have empty expected results.
- `testrail_csv_text` — Text template (all steps in one field). Simpler, and works with any TestRail configuration.
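To picture the difference: in the Steps template, each step is its own row, and only the last row of a case carries the expected result. A hand-written illustration of the layout (column subset only, not verbatim tool output):

```
Title,Step,Expected Result
POST /pet — happy path,Send POST request to /pet,
,Include only required fields with valid values,
,Verify response status and body,Returns success response (200)
```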
Steps template (recommended):
1. Export as `testrail_csv`
2. In TestRail, go to Test Cases → Import from CSV
3. Select the "Test Case (Steps)" template
4. Map columns: Title, Section, Type, Priority, Preconditions, Step, Expected Result
```
"Generate test cases for all endpoints tagged 'users'"
"Export as testrail_csv"
```

Text template:
```
"Export as testrail_csv_text"
```

Generic CSV (Qase, Zephyr, TestLink, etc.)
Most TMS platforms support CSV import. Use the csv format and map columns to your TMS fields during import.
- Qase — map `title`, `preconditions`, `steps`, `expected_result`, `priority`
- Zephyr Scale — map `Name`, `Precondition`, `Test Script (Plain Text)`, `Priority`, `Folder`
- TestLink — use the standard CSV import wizard
```
"Generate test cases for GET /users"
"Export as csv"
```

Working with Postman
Import as Postman Collection
Generate test cases and export as a Postman Collection v2.1 that can be imported directly into Postman or run via Newman in CI:
```
"Load the spec from https://internal-api.company.com/docs/swagger.json with auth_header Bearer eyJ..."
"Generate test cases for POST /users"
"Export as postman"
```

The exported collection includes:
Organized folders by test category (positive, negative, security, etc.)
Pre-configured HTTP method, URL, headers, and request body for each test case
Auto-generated test scripts with expected status code assertions
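For orientation, a Postman Collection v2.1 item with a status-code test script has roughly this shape. This is a hand-written sketch of the format, not the tool's exact output:

```json
{
  "info": {
    "name": "POST /pet test cases",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "TC-001: happy path with required fields only",
      "request": {
        "method": "POST",
        "url": "{{baseUrl}}/pet",
        "header": [{ "key": "Content-Type", "value": "application/json" }],
        "body": {
          "mode": "raw",
          "raw": "{\"name\":\"Buddy\",\"photoUrls\":[\"https://example.com/photo.jpg\"]}"
        }
      },
      "event": [
        {
          "listen": "test",
          "script": {
            "exec": ["pm.test('status is 200', () => pm.response.to.have.status(200));"]
          }
        }
      ]
    }
  ]
}
```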
Import into Postman
1. Open Postman → Import → drag the exported JSON file
2. The collection appears with folders for each endpoint
3. Set up an environment with a `baseUrl` variable pointing to your API
Run with Newman (CI)
```
newman run exported-collection.json \
  --environment env.json \
  --reporters cli,junit
```

Development
```
git clone https://github.com/eyaushev/swagger-testcase-mcp.git
cd swagger-testcase-mcp
npm install
npm run build
npm test
npm start
```

License
MIT