swagger-testcase-mcp

MCP server for API testing: generates test cases, validates specs, compares versions, and creates mock data from Swagger/OpenAPI specifications.

Feed it your API spec URL — get structured, categorized test cases (positive, negative, boundary, auth, security, idempotency, pagination) ready for manual testing or import into your TMS.

Why

  • Writing test cases manually is slow. A single endpoint with 10+ parameters can require 30-50 test cases across positive, negative, boundary, and security scenarios. Multiply that by dozens of endpoints — and you're spending days on routine work instead of actual testing.

  • Negative and boundary cases get skipped. Under deadline pressure, QA focuses on happy paths. Edge cases, invalid inputs, and security checks are the first to be cut — and the first to cause production incidents.

  • Specs change faster than test docs, and internal APIs get left behind. New endpoints, renamed fields, changed constraints — keeping test cases in sync is constant overhead. Corporate APIs behind VPN or custom auth make it even harder, since most tooling doesn't support authenticated spec access.

  • Getting test cases into your TMS takes extra steps. Even if you write great test cases, formatting them for TestRail, Allure, or Postman import is tedious work that adds no value.

This tool automates the baseline: point it at any Swagger/OpenAPI spec (public, internal, localhost), get categorized test cases in seconds, and save ready-to-import files for your TMS or test runner. You focus on business logic and exploratory testing — the tool handles the rest.

Features

  • Parses Swagger 2.0 and OpenAPI 3.x — JSON and YAML

  • Smart test case generation — analyzes parameters, schemas, constraints, response codes, and security to produce relevant test cases

  • 8 test categories — positive, negative, boundary, authorization, business logic, security injection, idempotency, pagination & sorting

  • 9 export formats — Markdown, JSON, CSV, Allure CSV, Gherkin, Postman, k6, pytest, TestRail CSV (two templates: Steps and Text)

  • Spec validation — quality scoring (score 0-100) with 12 rules and actionable suggestions

  • Spec comparison — diff two versions with breaking change detection

  • Mock data generation — realistic test data from schemas with locale support

  • Coverage analysis — identify gaps in test coverage with prioritized recommendations

  • Auto-save to file — exports are saved to the working directory automatically, no manual copy-paste

  • Configurable generation — filter by category, priority, locale, custom preconditions

  • Batch generation — filter by tag, HTTP method, path prefix, deprecated status

  • Works with any MCP client — Claude Desktop, Cursor, VS Code + Cline, etc.

Generated test case categories

| Category | What it covers |
| --- | --- |
| ✅ Positive | Happy path with required fields; happy path with all fields |
| ❌ Negative | Missing required fields, invalid types, empty strings, nulls, invalid enums, empty body |
| 📏 Boundary | Min/max values for numbers, minLength/maxLength for strings |
| 🔐 Auth | No token, invalid token, insufficient permissions |
| 💼 Business Logic | 404 for non-existent resources, 409 conflicts, 429 rate limiting, concurrency |
| 🛡️ Security | XSS, SQL injection, path traversal, command injection, CRLF injection |
| 🔄 Idempotency | Repeated PUT/DELETE, POST with Idempotency-Key |
| 📄 Pagination | First/last page, negative offset, zero limit, invalid sort fields |
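The boundary category, for example, derives its cases directly from schema constraints such as minLength/maxLength. A minimal sketch of how boundary values can be derived from a constraint (illustrative only, not the tool's actual implementation):

```python
# Illustrative sketch: derive boundary-length test strings from
# OpenAPI-style string constraints. Not the tool's actual algorithm.

def string_boundaries(schema: dict) -> dict:
    """Return boundary-length strings for a schema with minLength/maxLength."""
    lo, hi = schema["minLength"], schema["maxLength"]
    return {
        "below_min": "a" * (lo - 1),   # one char too short: expect a 4xx
        "at_min":    "a" * lo,         # shortest valid value: expect success
        "at_max":    "a" * hi,         # longest valid value: expect success
        "above_max": "a" * (hi + 1),   # one char too long: expect a 4xx
    }

cases = string_boundaries({"minLength": 2, "maxLength": 10})
print({k: len(v) for k, v in cases.items()})
# -> {'below_min': 1, 'at_min': 2, 'at_max': 10, 'above_max': 11}
```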

Installation

npm install -g swagger-testcase-mcp

Or run directly with npx:

npx swagger-testcase-mcp

Configuration

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}

Add to .cursor/mcp.json:

{
  "mcpServers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}

Add to .vscode/mcp.json:

{
  "servers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}

Add to Cline MCP settings:

{
  "swagger-testcase-mcp": {
    "command": "npx",
    "args": ["-y", "swagger-testcase-mcp"]
  }
}

For JetBrains IDEs (AI Assistant), go to Settings → Tools → AI Assistant → MCP Servers → Add, select stdio and fill in:

  • Command: npx

  • Arguments: -y swagger-testcase-mcp

Or add manually to the MCP config file:

{
  "servers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}

Add to ~/.windsurf/mcp.json:

{
  "mcpServers": {
    "swagger-testcase-mcp": {
      "command": "npx",
      "args": ["-y", "swagger-testcase-mcp"]
    }
  }
}

Quick Start

1. "Load the spec from https://petstore3.swagger.io/api/v3/openapi.json"
2. "Generate test cases for POST /pet"
3. "Export as Postman collection"

Here is what the generated output looks like (markdown export):

## Positive Tests (1)

### TC-001: POST /pet — happy path with required fields only

- **Priority:** high
- **Preconditions:** Valid authentication credentials are available
- **Steps:**
  1. Send POST request to /pet
  2. Include only required parameters/fields with valid values
  3. Verify response status and body
- **Input:** `{"name":"Buddy","photoUrls":["https://example.com/photo.jpg"]}`
- **Expected Result:** Returns success response with expected data
- **Expected Status:** 200

## Negative Tests (2)

### TC-002: POST /pet — missing required field "name"

- **Priority:** high
- **Preconditions:** Valid authentication credentials are available
- **Steps:**
  1. Send POST request to /pet
  2. Omit required field "name"
  3. Verify error response
- **Input:** `{"_omit":"name"}`
- **Expected Result:** Returns validation error indicating "name" is required
- **Expected Status:** 422

### TC-003: POST /pet — invalid type for "name"

- **Priority:** medium
- **Preconditions:** Valid authentication credentials are available
- **Steps:**
  1. Send POST request to /pet
  2. Set "name" to invalid type value: 12345
  3. Verify error response
- **Input:** `{"name":12345}`
- **Expected Result:** Returns validation error for invalid type of "name"
- **Expected Status:** 422

The generator produces test cases across 8 categories (positive, negative, boundary, auth, business logic, security, idempotency, pagination); the example above shows just a small subset. Use generate_test_cases_batch to cover an entire API at once.
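In the export above, the Input value `{"_omit":"name"}` is a marker meaning the field is dropped from an otherwise valid payload. A hypothetical consumer script could resolve such markers against a base payload like this (the helper name and merge behavior are assumptions for illustration, not part of the tool's contract):

```python
# Hypothetical helper: apply a generated negative-case input to a base
# payload. The "_omit" marker (seen in TC-002 above) drops a field from
# an otherwise valid request body; any other keys override base values.

def build_payload(base: dict, case_input: dict) -> dict:
    case_input = dict(case_input)              # don't mutate the caller's dict
    omitted = case_input.pop("_omit", None)    # field name to drop, if any
    payload = {**base, **case_input}           # apply overrides like {"name": 12345}
    if omitted is not None:
        payload.pop(omitted, None)             # drop the required field
    return payload

base = {"name": "Buddy", "photoUrls": ["https://example.com/photo.jpg"]}
print(build_payload(base, {"_omit": "name"}))  # TC-002: "name" missing
print(build_payload(base, {"name": 12345}))    # TC-003: wrong type for "name"
```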

For detailed examples of every tool and workflow, see EXAMPLES.md.

Available tools

| Tool | Description |
| --- | --- |
| fetch_swagger | Load and parse a spec from a public URL, internal URL, or local file (with optional auth) |
| analyze_endpoint | Get detailed info about a specific endpoint |
| generate_test_cases | Generate test cases for one endpoint (with optional config) |
| generate_test_cases_batch | Generate test cases for multiple endpoints (filter by tag/method/path prefix) |
| export_test_cases | Export test cases and save to file (markdown, json, csv, allure_csv, gherkin, postman, k6, pytest, testrail_csv, testrail_csv_text) |
| compare_specs | Compare two spec versions, detect breaking changes |
| validate_spec | Validate spec quality (score 0-100, 12 rules) |
| generate_mock_data | Generate realistic mock data from schemas |
| suggest_missing_tests | Analyze test coverage and suggest improvements |
| clear_cache | Clear cached specs and/or test cases |

Generation config

All generation tools accept an optional config parameter:

{
  "config": {
    "categories": ["positive", "negative", "security"],
    "priorityFilter": ["high"],
    "maxFieldsForNegative": 15,
    "locale": "ru",
    "customPreconditions": ["Database seeded with test data"],
    "skipDeprecated": true
  }
}

| Parameter | Description |
| --- | --- |
| categories | Which test categories to generate: positive, negative, boundary, auth, business_logic, security, idempotency, pagination. Default: all |
| priorityFilter | Only generate cases with these priorities: high, medium, low. Default: all |
| maxFieldsForNegative | Max number of fields for invalid-type negative cases. Default: 10 |
| locale | Language for test case text: en, ru. Default: en |
| customPreconditions | Additional preconditions added to all generated cases |
| skipDeprecated | Skip deprecated endpoints in batch generation. Default: false |

Export to file

All exports are automatically saved to the working directory. The filename is generated from the endpoint and format:

test-cases_POST__pet_postman.json
test-cases_GET__users_testrail_csv.csv
test-cases_POST__orders_k6.js
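Judging from these examples, the default name appears to follow the pattern test-cases_{METHOD}_{path with slashes replaced by underscores}_{format}{extension}. A sketch of that scheme (an inference from the examples above, not the tool's documented algorithm):

```python
# Sketch of the apparent default-filename scheme, inferred from the
# example filenames above. Not the tool's documented algorithm.

EXTENSIONS = {"postman": ".json", "testrail_csv": ".csv", "k6": ".js"}

def default_filename(method: str, path: str, fmt: str) -> str:
    safe_path = path.replace("/", "_")  # "/pet" -> "_pet" (hence the double underscore)
    return f"test-cases_{method}_{safe_path}_{fmt}{EXTENSIONS[fmt]}"

print(default_filename("POST", "/pet", "postman"))
# -> test-cases_POST__pet_postman.json
```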

You can also specify a custom path:

"Export as postman to /path/to/my-collection.json"

| Format | File extension |
| --- | --- |
| markdown | .md |
| json | .json |
| csv, allure_csv, testrail_csv, testrail_csv_text | .csv |
| gherkin | .feature |
| postman | .json |
| k6 | .js |
| pytest | .py |

Limitations

The tool generates test cases based on what is described in the OpenAPI spec — parameters, schemas, constraints, response codes, and security definitions. It does not cover:

  • Business logic — rules like "discount cannot exceed 50% for non-admin users" or "order requires at least one item in stock" are not part of the spec and won't produce test cases automatically

  • Cross-endpoint dependencies — the tool works with each endpoint in isolation. It won't generate chains like "create user → create order → check status"

  • Custom validation rules — constraints beyond what OpenAPI supports (e.g., conditional field requirements, cross-field validation) are not detected

How to compensate: since the tool runs inside an MCP client (Claude, Cursor, etc.), you can describe your business rules in the chat and ask the LLM to extend or adjust the generated test cases. The tool provides the baseline; the LLM adds the context.

Tip: The better your OpenAPI spec (constraints, enums, examples, descriptions), the more relevant the generated test cases will be. Use validate_spec to check your spec quality and get suggestions for improvement.

Supported sources

Works with any Swagger/OpenAPI spec — public, internal corporate network, localhost, or local files.

Public and local specs

"Load the spec from https://petstore3.swagger.io/api/v3/openapi.json"
"Load the spec from /path/to/swagger.json"
"Load the spec from ./api-spec.yaml"
"Load the spec from http://localhost:8080/v3/api-docs"

Internal corporate APIs

Access specs behind authentication — VPN, SSO, or internal networks:

"Load the spec from https://internal-api.company.com/docs/swagger.json with auth_header Bearer eyJhbGci..."
{
  "source": "https://internal-api.company.com/docs/swagger.json",
  "auth_header": "Bearer eyJhbGciOiJIUzI1NiIs..."
}
{
  "source": "https://internal-api.company.com/docs/swagger.json",
  "auth_header": "Basic dXNlcjpwYXNz"
}
{
  "source": "https://internal-api.company.com/docs/swagger.json",
  "headers": { "X-API-Key": "abc123" }
}

Tip: If your spec is behind a corporate VPN, make sure the MCP server process has network access to the internal URL. The server runs locally, so VPN or proxy settings on your machine apply automatically.

Integration with TMS

Allure TestOps

  1. Generate test cases and export as allure_csv

  2. Import via the Allure TestOps migration tool (configure fieldMapping to match the columns below)

  3. Columns: Name, Description, Precondition, Steps, Expected Result, Layer, Tags

  4. Or combine with the Allure TestOps MCP server to create test cases directly via API

Note: CSV import through the Allure TestOps UI was removed in version 25.3.4. Use the migration tool or API integration instead.

"Generate test cases for POST /pet"
"Export as allure_csv"

TestRail

Two export formats are available:

  • testrail_csv — Steps template (one row per step). Expected result is attached to the last step of each test case; intermediate steps have empty expected results.

  • testrail_csv_text — Text template (all steps in one field). Simpler, works with any TestRail configuration.

Steps template (recommended):

  1. Export as testrail_csv

  2. In TestRail, go to Test Cases → Import from CSV

  3. Select the "Test Case (Steps)" template

  4. Map columns: Title, Section, Type, Priority, Preconditions, Step, Expected Result

"Generate test cases for all endpoints tagged 'users'"
"Export as testrail_csv"

Text template:

"Export as testrail_csv_text"

Other TMS

Most TMS platforms support CSV import. Use the csv format and map columns to your TMS fields during import.

  • Qase — map title, preconditions, steps, expected_result, priority

  • Zephyr Scale — map Name, Precondition, Test Script (Plain Text), Priority, Folder

  • TestLink — use the standard CSV import wizard

"Generate test cases for GET /users"
"Export as csv"
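If your TMS expects different header names, the generic CSV is easy to remap before import. A sketch using Python's csv module (the source column names and the Zephyr Scale target fields shown here are assumptions for illustration; check them against your exported file):

```python
import csv
import io

# Illustrative remap of a generic CSV export to TMS-specific headers.
# Source column names are assumptions; verify against your exported file.
COLUMN_MAP = {  # generic export column -> Zephyr Scale field
    "title": "Name",
    "preconditions": "Precondition",
    "steps": "Test Script (Plain Text)",
    "priority": "Priority",
}

def remap_csv(text: str) -> str:
    """Rewrite CSV headers and reorder columns according to COLUMN_MAP."""
    rows = list(csv.DictReader(io.StringIO(text)))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(COLUMN_MAP.values()))
    writer.writeheader()
    for row in rows:
        writer.writerow({new: row.get(old, "") for old, new in COLUMN_MAP.items()})
    return out.getvalue()
```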

Working with Postman

Import as Postman Collection

Generate test cases and export as a Postman Collection v2.1 that can be imported directly into Postman or run via Newman in CI:

"Load the spec from https://internal-api.company.com/docs/swagger.json with auth_header Bearer eyJ..."
"Generate test cases for POST /users"
"Export as postman"

The exported collection includes:

  • Organized folders by test category (positive, negative, security, etc.)

  • Pre-configured HTTP method, URL, headers, and request body for each test case

  • Auto-generated test scripts with expected status code assertions

Import into Postman

  1. Open Postman → Import → drag the exported JSON file

  2. The collection appears with folders for each endpoint

  3. Set up an environment with baseUrl variable pointing to your API

Run with Newman (CI)

newman run exported-collection.json \
  --environment env.json \
  --reporters cli,junit

Development

git clone https://github.com/eyaushev/swagger-testcase-mcp.git
cd swagger-testcase-mcp
npm install
npm run build
npm test
npm start

License

MIT
