
OpenAPI to MCP

Turn any OpenAPI spec or Postman Collection into an MCP server. Like, instantly.


Got a REST API with an OpenAPI spec or a Postman Collection? Cool, now Claude can use it directly. No code required.

Quick Start

docker run -p 8080:8080 procoders/openapi-mcp-ts \
  --spec-url https://petstore3.swagger.io/api/v3/openapi.json \
  --upstream-url https://petstore3.swagger.io/api/v3

Point Claude Desktop at it:

{
  "mcpServers": {
    "petstore": {
      "url": "http://localhost:8080/mcp"
    }
  }
}

Done. Your API is now available as MCP tools. Go grab a coffee.

What's the Deal?

  • OpenAPI 3.0 / 3.1 - We parse your spec and turn each endpoint into an MCP tool

  • Postman Collections - Got a Postman collection instead? Works the same way

  • Auto-detection - We figure out what format you're using. You just point, we parse

  • Zero code - Just point at your spec, we handle the rest

  • Safe by default - DELETE methods are disabled unless you say otherwise

  • Auth built-in - API Key, Bearer, Basic - all supported

  • Filter tools - Use globs to include/exclude specific operations

  • Production-ready - Health checks, graceful shutdown, structured logging

Installation

Docker (the easy way)

docker pull procoders/openapi-mcp-ts:latest

From Source (if you're into that)

git clone https://github.com/procoders/openapi-mcp-ts.git
cd openapi-mcp-ts
npm install
npm run build
npm start -- --spec-url <url> --upstream-url <url>

Usage

OpenAPI

# From a URL
docker run -p 8080:8080 procoders/openapi-mcp-ts \
  --spec-url https://api.example.com/openapi.json \
  --upstream-url https://api.example.com

# From a local file
docker run -p 8080:8080 \
  -v ./my-api.yaml:/spec.yaml:ro \
  procoders/openapi-mcp-ts \
  --spec-file /spec.yaml \
  --upstream-url https://api.example.com

Postman Collections

Same deal, just point at your collection. We auto-detect the format:

# From a Postman collection URL
docker run -p 8080:8080 procoders/openapi-mcp-ts \
  --spec-url https://your-postman-collection.json \
  --upstream-url https://api.example.com

# From a local collection file
docker run -p 8080:8080 \
  -v ./my-collection.json:/spec.json:ro \
  procoders/openapi-mcp-ts \
  --spec-file /spec.json \
  --upstream-url https://api.example.com

Want to be explicit? Use --format:

docker run -p 8080:8080 \
  -v ./my-collection.json:/spec.json:ro \
  procoders/openapi-mcp-ts \
  --spec-file /spec.json \
  --format postman \
  --upstream-url https://api.example.com

Postman v2.0 and v2.1 collection formats are both supported. Nested folders become tags, path variables (:id) become OpenAPI-style ({id}), and all body types (raw JSON, form-data, urlencoded) just work.
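The `:id` → `{id}` rewrite is simple enough to sketch. This is an illustrative guess at the conversion, not the project's actual code:

```typescript
// Sketch of the Postman -> OpenAPI path-variable rewrite described above
// (":id" becomes "{id}"). Approximation only, not the real implementation.
function postmanToOpenApiPath(path: string): string {
  // Replace each ":name" segment with "{name}"
  return path.replace(/:([A-Za-z0-9_]+)/g, "{$1}");
}

console.log(postmanToOpenApiPath("/users/:id/posts/:postId"));
// "/users/{id}/posts/{postId}"
```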

With Auth

# Bearer token
docker run -p 8080:8080 \
  -e AUTH_TYPE=bearer \
  -e AUTH_VALUE="your-token-here" \
  procoders/openapi-mcp-ts \
  --spec-url https://api.example.com/openapi.json \
  --upstream-url https://api.example.com

With a Config File

For more complex setups, use a YAML config:

# config.yaml
spec:
  url: https://api.example.com/openapi.json

upstream:
  baseUrl: https://api.example.com
  timeout: 30000
  headers:
    X-Request-ID: "{{REQUEST_ID}}"  # System variables ftw
    X-Timestamp: "{{TIMESTAMP}}"

tools:
  include: ["get_*", "list_*"]  # Only expose read operations
  exclude: ["*_admin_*"]        # Hide admin stuff
  autoDisable:
    methods: ["DELETE"]  # Safety first
    deprecated: true

auth:
  type: apiKey
  in: header
  name: X-API-Key
  value: "${API_KEY}"  # Env var interpolation

logging:
  level: info
  format: json

Then run:

docker run -p 8080:8080 \
  -v ./config.yaml:/config.yaml:ro \
  -e API_KEY="your-key" \
  procoders/openapi-mcp-ts \
  --config /config.yaml
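Curious how those include/exclude globs behave? They can be approximated like this — an illustrative sketch, not the actual implementation (the function names here are made up):

```typescript
// Sketch of glob-based tool filtering ("get_*", "*_admin_*").
// Approximation only; these helpers are not from the project's source.
function globToRegExp(glob: string): RegExp {
  // Escape regex metacharacters, then turn "*" into ".*"
  const escaped = glob.replace(/[.+^${}()|[\]\\]/g, "\\$&").replace(/\*/g, ".*");
  return new RegExp(`^${escaped}$`);
}

function isToolEnabled(name: string, include: string[], exclude: string[]): boolean {
  // Empty include list means "include everything"; exclude always wins
  const included = include.length === 0 || include.some((g) => globToRegExp(g).test(name));
  const excluded = exclude.some((g) => globToRegExp(g).test(name));
  return included && !excluded;
}

console.log(isToolEnabled("get_user", ["get_*", "list_*"], ["*_admin_*"]));        // true
console.log(isToolEnabled("get_admin_users", ["get_*", "list_*"], ["*_admin_*"])); // false
```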

CLI Options

| Flag | What it does | Default |
|------|--------------|---------|
| `--spec-url <url>` | Fetch spec from URL | - |
| `--spec-file <path>` | Load spec from local file | - |
| `--format <format>` | Force format: `openapi` or `postman` | auto |
| `--upstream-url <url>` | Base URL for API requests | - |
| `--port <port>` | Server port | 8080 |
| `--host <host>` | Host to bind | 0.0.0.0 |
| `--config <path>` | Config file path | - |
| `--include-tools <glob>` | Only include matching tools | - |
| `--exclude-tools <glob>` | Exclude matching tools | - |
| `--log-level <level>` | debug/info/warn/error | info |
| `--log-format <format>` | json/pretty | json |

Environment Variables

Everything can be set via env vars too:

| Variable | What it does |
|----------|--------------|
| `SPEC_URL` | Spec URL (OpenAPI or Postman) |
| `SPEC_FILE` | Path to spec file |
| `SPEC_FORMAT` | Force format: `openapi` or `postman` |
| `UPSTREAM_URL` | Upstream API base URL |
| `PORT` | Server port |
| `LOG_LEVEL` | Log level |
| `AUTH_TYPE` | none/apiKey/bearer/basic |
| `AUTH_NAME` | Header or query param name |
| `AUTH_VALUE` | The actual auth value |
| `INCLUDE_TOOLS` | Comma-separated include patterns |
| `EXCLUDE_TOOLS` | Comma-separated exclude patterns |

Endpoints

| Endpoint | What's there |
|----------|--------------|
| `GET /health` | Health check (for load balancers) |
| `GET /tools` | List all available tools as JSON |
| `POST /mcp` | The MCP protocol endpoint |

Dynamic Tool Filtering

Different AI agents need different tools? No problem:

http://localhost:8080/mcp?tools=get_user,list_users

Only those tools will be available in that session. Pretty handy.
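For example, a Claude Desktop entry scoped to just those two tools could look like this (same shape as the Quick Start config; the server name `users-readonly` is just an illustrative label):

```json
{
  "mcpServers": {
    "users-readonly": {
      "url": "http://localhost:8080/mcp?tools=get_user,list_users"
    }
  }
}
```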

System Variables

You can use these in your upstream headers config:

| Variable | Value |
|----------|-------|
| `{{TIMESTAMP}}` | Unix timestamp in ms |
| `{{REQUEST_ID}}` | UUID for request tracing |
| `{{ISO_DATE}}` | ISO 8601 formatted date |
| `{{USER_IP}}` | Client IP address |

Tool Naming

We convert your operationId to snake_case:

  • getPetById → get_pet_by_id

  • listUsers → list_users

  • createNewOrder → create_new_order

No operationId? We generate from method + path:

  • GET /pets/{id} → get_pets_id

For Postman collections, the request name becomes the tool name:

  • "Get All Users" → get_all_users

  • "Create New Post" → create_new_post
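The conversion above can be sketched roughly like so — an approximation of the naming rules, not the project's actual code:

```typescript
// Rough sketch of the tool-naming rules described above: split camelCase,
// collapse non-alphanumerics to "_", lowercase. Approximation only.
function toToolName(name: string): string {
  return name
    .replace(/([a-z0-9])([A-Z])/g, "$1_$2") // camelCase boundaries -> "_"
    .replace(/[^A-Za-z0-9]+/g, "_")         // spaces, slashes, braces -> "_"
    .replace(/^_+|_+$/g, "")                // trim stray underscores
    .toLowerCase();
}

console.log(toToolName("getPetById"));     // "get_pet_by_id"
console.log(toToolName("Get All Users"));  // "get_all_users"
console.log(toToolName("GET /pets/{id}")); // "get_pets_id"
```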

Safety Stuff

Auto-disabled Methods

DELETE operations are disabled by default. Enable them explicitly if you need them.
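To turn them back on, override autoDisable in your config. Assuming an empty methods list re-enables everything (check against the config format shown earlier if behavior differs):

```yaml
# config.yaml - presumably clears the default DELETE block
tools:
  autoDisable:
    methods: []       # nothing auto-disabled
    deprecated: true  # still hide deprecated operations
```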

Tool Count Warning

We'll warn you if you enable more than 10 tools. LLMs tend to get confused with too many options.

Development

npm install
npm run dev -- --spec-url https://petstore3.swagger.io/api/v3/openapi.json \
  --upstream-url https://petstore3.swagger.io/api/v3

# Other commands
npm run build      # Compile TypeScript
npm run typecheck  # Type check only
npm run lint       # ESLint

Don't Want to Self-Host?

Check out MCPize - we'll host it for you. Deploy from your OpenAPI spec in 60 seconds, no infrastructure needed.

Get Started Free →

Contributing

See CONTRIBUTING.md. We're friendly, promise.

License

Apache 2.0 - do what you want, just keep the license.
