OpenAPI to MCP

Standalone proxy that turns any OpenAPI/Swagger-described HTTP API into an MCP (Model Context Protocol) server. It loads the spec at startup, filters operations by include/exclude, and registers one MCP tool per API operation. Tool calls are executed as HTTP requests to the backend API.

Useful when you already have (or want) a REST API with an OpenAPI/Swagger spec: the same spec drives both human-readable API docs and MCP tools for AI clients.

How it works

flowchart LR
  subgraph startup["Startup"]
    A[OpenAPI spec<br/>URL or file] --> B[Load and filter<br/>include or exclude]
    B --> C[N MCP tools<br/>one per operation]
  end

  subgraph runtime["Runtime"]
    D[MCP client] <-->|Streamable HTTP<br/>POST/GET /mcp| E[openapi-to-mcp]
    E <-->|HTTP| F[Backend API]
  end

  C -.->|registered in| E

Logging and Correlation IDs

The server includes comprehensive logging with correlation ID support for request tracking:

  • Correlation ID: Extracted from X-Correlation-ID header (case-insensitive) or auto-generated for each request

  • Log levels: DEBUG, INFO, WARN, ERROR (configurable via MCP_LOG_LEVEL env var, default: INFO)

  • Log format: [correlation_id] LEVEL message with optional context data

  • Request tracking: All logs include correlation ID for tracing requests through the system

For E2E testing, pass an X-Correlation-ID header with your request to trace it across all logs.
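For example, a request with an explicit correlation ID might look like this (a sketch assuming the server runs on the default port 3100; the header value my-debug-123 is arbitrary):

```shell
# Send a known correlation ID so this request can be traced in the server logs.
curl -s -X POST http://localhost:3100/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H 'X-Correlation-ID: my-debug-123' \
  -d '{"jsonrpc":"2.0","id":1,"method":"ping"}'
```

Every log line produced while handling this request should then carry [my-debug-123] as its correlation ID.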

  1. Load OpenAPI spec from MCP_OPENAPI_SPEC (URL starting with http:// or https://, or file path).

  2. Collect operations (method + path). Filter: if MCP_INCLUDE_ENDPOINTS is set, keep only those; otherwise drop any in MCP_EXCLUDE_ENDPOINTS. Include has priority over exclude.

  3. For each operation, create an MCP tool. The tool name is MCP_TOOL_PREFIX plus the path segments (e.g. api_ + messages = api_messages); path parameters are included in the name (e.g. /channels/{username} becomes channels_username). If the same path is served by more than one method (e.g. GET and PUT on /pet/{id}), the name is made unique by appending the method (pet_id_get, pet_id_put). The input schema is built from parameters and requestBody (validated with Zod), and the handler performs the HTTP call against MCP_API_BASE_URL.

  4. Load MCP server instructions: by default uses info.description from OpenAPI spec. Optionally, load custom instructions from MCP_INSTRUCTIONS_FILE and combine with OpenAPI description according to MCP_INSTRUCTIONS_MODE (default/replace/append/prepend). If file loading fails, server logs a warning and continues with OpenAPI instructions only.
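The naming rules in step 3 can be sketched as follows (a hypothetical helper for illustration only; toolName and its signature are not the project's actual API):

```typescript
// Sketch of the naming rules in step 3 (hypothetical helper, not the
// project's actual code; the real implementation may differ in detail).
function toolName(
  prefix: string,    // MCP_TOOL_PREFIX, may be empty
  method: string,    // HTTP method in lowercase, e.g. "get"
  path: string,      // e.g. "/channels/{username}"
  collides: boolean, // true if another method uses the same path
): string {
  // "/channels/{username}" -> "channels_username"
  const base = path
    .split("/")
    .filter(Boolean)
    .map((segment) => segment.replace(/[{}]/g, ""))
    .join("_");
  // Append the method only when the path alone would be ambiguous.
  return prefix + (collides ? `${base}_${method}` : base);
}

console.log(toolName("api_", "get", "/messages", false)); // api_messages
console.log(toolName("", "get", "/pet/{id}", true));      // pet_id_get
console.log(toolName("", "put", "/pet/{id}", true));      // pet_id_put
```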

Transport: Streamable HTTP. Endpoint: POST /mcp and GET /mcp.

Environment variables (MCP_ prefix)

Environment variables are loaded from .env file in the project root (using dotenv). You can also set them directly in your shell environment. See .env.example for a template.

| Variable | Description | Default |
| --- | --- | --- |
| MCP_API_BASE_URL | Base URL for API requests | http://127.0.0.1:3000 |
| MCP_API_BASIC_AUTH | Basic auth for API requests: username:password. Use when the remote API is protected by HTTP Basic Auth. If both this and MCP_API_BEARER_TOKEN are set, Bearer is used. | - |
| MCP_API_BEARER_TOKEN | Bearer token for API requests. Use when the remote API expects Authorization: Bearer <token>. Takes precedence over MCP_API_BASIC_AUTH when both are set. | - |
| MCP_OPENAPI_SPEC | OpenAPI spec source: URL (starts with http:// or https://) or file path (e.g. http://api:3000/openapi.json or ./openapi.json). URL vs. file is detected automatically. | - |
| MCP_INCLUDE_ENDPOINTS | Comma-separated method:path pairs (e.g. get:/messages,get:/channels). If set, only these become tools. | - |
| MCP_EXCLUDE_ENDPOINTS | Comma-separated method:path pairs to exclude. Ignored for endpoints listed in include. | - |
| MCP_TOOL_PREFIX | Prefix for tool names (e.g. api_ -> api_messages, api_channels) | (empty) |
| MCP_SERVER_NAME | Server name reported to MCP clients | openapi-to-mcp |
| MCP_PORT | Port for the Streamable HTTP server | 3100 |
| MCP_HOST | Bind host | 0.0.0.0 |
| MCP_LOG_LEVEL | Log level: DEBUG, INFO, WARN, ERROR (case-insensitive) | INFO |
| MCP_INSTRUCTIONS_FILE | Path to a custom instructions file (text file with MCP server instructions) | - |
| MCP_INSTRUCTIONS_MODE | How to combine custom instructions with the OpenAPI description: default (OpenAPI only, ignore the file), replace (file only, ignore OpenAPI), append (OpenAPI + file), prepend (file + OpenAPI). Case-insensitive. | default |
| MCP_CONVERT_HTML_TO_MARKDOWN | Convert HTML tags in operation descriptions to Markdown. Set to false to disable. | true |

MCP_OPENAPI_SPEC must be set. If it starts with http:// or https://, it's treated as a URL; otherwise, it's treated as a file path.

Backward compatibility: MCP_OPENAPI_SPEC_URL and MCP_OPENAPI_SPEC_FILE are still supported but deprecated. MCP_OPENAPI_SPEC takes precedence if set.
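Putting the essentials together, a minimal .env might look like this (all values are illustrative; only MCP_OPENAPI_SPEC and MCP_API_BASE_URL are typically required):

```shell
# Example .env (illustrative values; adjust to your setup)
MCP_OPENAPI_SPEC=http://127.0.0.1:3000/openapi.json
MCP_API_BASE_URL=http://127.0.0.1:3000
MCP_TOOL_PREFIX=api_
MCP_PORT=3100
MCP_LOG_LEVEL=INFO
```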

Run with npm (local)

  1. Copy .env.example to .env and set at least the OpenAPI spec source and API base URL:

    cp .env.example .env
    # Edit .env: MCP_OPENAPI_SPEC (URL or file path), MCP_API_BASE_URL
  2. Install, build, and start:

    npm ci
    npm run build
    npm run start
  3. The server listens on http://<MCP_HOST>:<MCP_PORT> (default http://0.0.0.0:3100). Connect MCP clients to POST/GET http://localhost:3100/mcp (Streamable HTTP).

Ensure the backend API is reachable at MCP_API_BASE_URL and that the OpenAPI spec URL (or file) returns a valid OpenAPI 3.x JSON.
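As a quick smoke test against a running server, you can send an MCP initialize request over Streamable HTTP (a sketch assuming the default port 3100; the exact response shape depends on the MCP SDK version in use):

```shell
# Initialize an MCP session against the local server.
curl -s -X POST http://localhost:3100/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"curl","version":"0.0.0"}}}'
```

A JSON-RPC result containing the server's name and capabilities indicates the proxy is up and the spec was loaded.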

Using MCP Inspector

To test the server with MCP Inspector:

  1. Start the MCP server (see above).

  2. Run MCP Inspector: npx @modelcontextprotocol/inspector

  3. In the Inspector UI, select "streamable-http" transport type (not STDIO).

  4. Enter the server URL: http://localhost:3100/mcp

  5. Click "Connect".

The server includes CORS support for browser-based MCP clients and maintains sessions for Streamable HTTP transport.

Run with Docker

Image on Docker Hub: evilfreelancer/openapi-to-mcp. Use tag latest or a version tag (e.g. v1.0.0).

  1. Pull and run with env vars (example: spec from URL, API at host):

    docker run --rm -p 3100:3100 \
      -e MCP_OPENAPI_SPEC=http://host.docker.internal:3000/openapi.json \
      -e MCP_API_BASE_URL=http://host.docker.internal:3000 \
      evilfreelancer/openapi-to-mcp:latest

    On Linux you may need --add-host=host.docker.internal:host-gateway or use the host network. Alternatively pass a file path and mount the spec:

    docker run --rm -p 3100:3100 \
      -v $(pwd)/openapi.json:/app/openapi.json:ro \
      -e MCP_OPENAPI_SPEC=/app/openapi.json \
      -e MCP_API_BASE_URL=http://host.docker.internal:3000 \
      evilfreelancer/openapi-to-mcp:latest

    To build the image locally instead: docker build -t openapi-to-mcp . and use openapi-to-mcp as the image name in the commands above.

Run with Docker Compose

A minimal docker-compose.yaml is included so you can run the MCP server and optionally point it at an existing API. It uses the image from Docker Hub (evilfreelancer/openapi-to-mcp).

  1. Copy .env.example to .env and set:

    • MCP_OPENAPI_SPEC (URL like http://api:3000/openapi.json or file path like ./openapi.json)

    • MCP_API_BASE_URL (e.g. http://api:3000 if the API runs in another container)

  2. From the project root:

    docker compose up -d
  3. The MCP server will be available at http://localhost:3100/mcp (Streamable HTTP).

To use a local OpenAPI file instead of a URL, set MCP_OPENAPI_SPEC to the file path and mount the file into the container (see docker-compose.yaml comments if present).
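A compose fragment for the file-based setup could look like this (a sketch only: the service names mcp and api, and the /app/openapi.json mount path, are assumptions; check the repository's docker-compose.yaml for the actual layout):

```yaml
# Hypothetical compose fragment: mount a local spec into the container.
services:
  mcp:
    image: evilfreelancer/openapi-to-mcp:latest
    ports:
      - "3100:3100"
    environment:
      MCP_OPENAPI_SPEC: /app/openapi.json
      MCP_API_BASE_URL: http://api:3000
    volumes:
      - ./openapi.json:/app/openapi.json:ro
```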

Tests

npm test

Tests cover: config (env vars, include/exclude, defaults), OpenAPI loader (URL and file detection, error when unset), instructions loader (file loading and combination modes), and openapi-to-tools (filtering, prefix, handler calling API with success and error). HTTP is mocked (axios-mock-adapter).

Dockerfile

The project includes a Dockerfile (Node 20 Alpine): it installs dependencies, builds the TypeScript sources, prunes to production dependencies, and runs node dist/index.js. No dev dependencies or tests are included in the image. Pre-built images are published to Docker Hub. To build locally:

docker build -t openapi-to-mcp .

CI - Docker image on Docker Hub

A GitHub Actions workflow (.github/workflows/docker-publish.yml) runs tests, then builds the image and pushes it to Docker Hub.

  • Triggers: manually (Actions → "Docker build and push" → Run workflow) or on push of any git tag.

  • Version: on tag push the image tag equals the git tag (e.g. v1.0.0); on manual run you can set a version (default latest).

  • Main only: when triggered by a tag, the workflow checks that the tag points to a commit on main; otherwise the run fails.

Required repository secrets (Settings → Secrets and variables → Actions):

| Secret | Description |
| --- | --- |
| DOCKERHUB_USERNAME | Docker Hub username (image will be DOCKERHUB_USERNAME/openapi-to-mcp) |
| DOCKERHUB_TOKEN | Docker Hub access token (recommended) or password |

Similar projects

  • mcp-openapi-proxy (Python) – MCP server that exposes REST APIs from OpenAPI specs as MCP tools. Low-level mode (one tool per endpoint) or FastMCP mode. Auth and endpoint filtering. Install: uvx mcp-openapi-proxy.

  • openapi-mcp-proxy (TypeScript) – CLI that turns an OpenAPI service into an MCP server; middleware between OpenAPI and MCP clients.

  • openapi-mcp-generator (TypeScript) – Generates a full MCP server project from OpenAPI 3.0+ (stdio, SSE, Streamable HTTP), with Zod validation and auth. Install: npm install -g openapi-mcp-generator.

  • FastMCP + OpenAPI (Python) – OpenAPI integration for FastMCP: auth, route mapping, parameter handling.

  • openapi-mcp-codegen – Code generator from OpenAPI to MCP server (Apache 2.0).

  • Swagger MCP (Vizioz) – AI-driven MCP server generation from Swagger/OpenAPI; stores specs locally.

  • liblab – Cloud service: generate and deploy MCP server from OpenAPI or Postman collection.
