Exposes any REST API described by a Swagger or OpenAPI specification as MCP tools, allowing AI models to interact with the backend API operations by automatically generating tool definitions from the API documentation.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type `@` followed by the MCP server name and your instructions, e.g. "@OpenAPI to MCP get the details for order 10245".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
OpenAPI to MCP
Standalone proxy that turns any OpenAPI/Swagger-described HTTP API into an MCP (Model Context Protocol) server. It loads the spec at startup, filters operations by include/exclude, and registers one MCP tool per API operation. Tool calls are executed as HTTP requests to the backend API.
Useful when you already have (or want) a REST API with an OpenAPI/Swagger spec: the same spec drives both human-readable API docs and MCP tools for AI clients.
How it works
1. Load the OpenAPI spec from `MCP_OPENAPI_SPEC_URL` (preferred) or `MCP_OPENAPI_SPEC_FILE`.
2. Collect operations (method + path) and filter them: if `MCP_INCLUDE_ENDPOINTS` is set, keep only those; otherwise drop any listed in `MCP_EXCLUDE_ENDPOINTS`. Include has priority over exclude.
3. For each operation, create an MCP tool: name = `MCP_TOOL_PREFIX` + path segment (e.g. `api_` + `messages` = `api_messages`). If the same path segment is used by more than one method (e.g. GET and PUT on `/pet/{id}`), the tool name is made unique by appending the method (e.g. `pet_get`, `pet_put`). The input schema is built from parameters and requestBody (Zod), and the handler executes an HTTP call against `MCP_API_BASE_URL`.
Transport: Streamable HTTP. Endpoints: `POST /mcp` and `GET /mcp`.
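The naming rule in step 3 can be sketched as follows. This is a hypothetical `toolNames` helper written for illustration, not the project's actual code; it only shows the prefix + path-segment + method-suffix logic:

```typescript
// One operation per OpenAPI (method, path) pair.
type Operation = { method: string; path: string };

function toolNames(ops: Operation[], prefix = ""): string[] {
  // Last concrete path segment, skipping parameters like {id}.
  const segment = (p: string) =>
    p.split("/").filter((s) => s && !s.startsWith("{")).pop() ?? "root";

  // Count how many operations share each segment.
  const counts = new Map<string, number>();
  for (const op of ops) {
    const s = segment(op.path);
    counts.set(s, (counts.get(s) ?? 0) + 1);
  }

  // Shared segments get the HTTP method appended to stay unique.
  return ops.map((op) => {
    const s = segment(op.path);
    const base = prefix + s;
    return counts.get(s)! > 1 ? `${base}_${op.method.toLowerCase()}` : base;
  });
}
```

With this sketch, GET and PUT on `/pet/{id}` yield `pet_get` and `pet_put`, while a lone `/messages` with prefix `api_` yields `api_messages`, matching the examples above.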
Environment variables (MCP_ prefix)
| Variable | Description | Default |
|---|---|---|
| `MCP_API_BASE_URL` | Base URL for API requests | (see `.env.example`) |
| `MCP_OPENAPI_SPEC_URL` | URL of the OpenAPI spec (e.g. your API's `/openapi.json` URL) | - |
| `MCP_OPENAPI_SPEC_FILE` | Path to an OpenAPI JSON file (used if the URL is not set) | - |
| `MCP_INCLUDE_ENDPOINTS` | Comma-separated list of endpoints to include | - |
| `MCP_EXCLUDE_ENDPOINTS` | Comma-separated list of endpoints to exclude | - |
| `MCP_TOOL_PREFIX` | Prefix for tool names (e.g. `api_`) | (empty) |
| (see `.env.example`) | Server name reported to MCP clients | - |
| `MCP_PORT` | Port for the Streamable HTTP server | `3100` |
| `MCP_HOST` | Bind host | `0.0.0.0` |
At least one of MCP_OPENAPI_SPEC_URL or MCP_OPENAPI_SPEC_FILE must be set.
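For example, a minimal configuration pointing at a local API could look like this (a sketch with placeholder URLs; adjust to your setup):

```ini
# Hypothetical minimal .env for a local backend API
MCP_OPENAPI_SPEC_URL=http://localhost:3000/openapi.json
MCP_API_BASE_URL=http://localhost:3000
MCP_TOOL_PREFIX=api_
MCP_PORT=3100
```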
Run with npm (local)
Copy `.env.example` to `.env` and set at least the OpenAPI spec source and the API base URL:

```bash
cp .env.example .env
# Edit .env: MCP_OPENAPI_SPEC_URL or MCP_OPENAPI_SPEC_FILE, MCP_API_BASE_URL
```

Install, build, and start:

```bash
npm ci
npm run build
npm run start
```

The server listens on `http://<MCP_HOST>:<MCP_PORT>` (default `http://0.0.0.0:3100`). Connect MCP clients to POST/GET `http://localhost:3100/mcp` (Streamable HTTP).
Ensure the backend API is reachable at MCP_API_BASE_URL and that the OpenAPI spec URL (or file) returns a valid OpenAPI 3.x JSON.
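An MCP client opens a session by POSTing a JSON-RPC `initialize` request to `/mcp`. The sketch below shows roughly what that request body looks like; the field values are illustrative, and in practice an MCP client or SDK constructs this for you:

```typescript
// JSON-RPC 2.0 "initialize" request an MCP client sends to open a session.
const initializeRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2025-03-26",
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.0.0" },
  },
};

// Serialized, this is the body POSTed to http://localhost:3100/mcp
// with Content-Type: application/json and
// Accept: application/json, text/event-stream.
const body = JSON.stringify(initializeRequest);
```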
Run with Docker
Image on Docker Hub: evilfreelancer/openapi-to-mcp. Use tag latest or a version tag (e.g. v1.0.0).
Pull and run with env vars (example: spec from URL, API at host):
```bash
docker run --rm -p 3100:3100 \
  -e MCP_OPENAPI_SPEC_URL=http://host.docker.internal:3000/openapi.json \
  -e MCP_API_BASE_URL=http://host.docker.internal:3000 \
  evilfreelancer/openapi-to-mcp:latest
```

On Linux you may need `--add-host=host.docker.internal:host-gateway` or the host network. Alternatively, pass a file path and mount the spec:

```bash
docker run --rm -p 3100:3100 \
  -v $(pwd)/openapi.json:/app/openapi.json:ro \
  -e MCP_OPENAPI_SPEC_FILE=/app/openapi.json \
  -e MCP_API_BASE_URL=http://host.docker.internal:3000 \
  evilfreelancer/openapi-to-mcp:latest
```

To build the image locally instead, run `docker build -t openapi-to-mcp .` and use `openapi-to-mcp` as the image name in the commands above.
Run with Docker Compose
A minimal docker-compose.yaml is included so you can run the MCP server and optionally point it at an existing API. It uses the image from Docker Hub (evilfreelancer/openapi-to-mcp).
Copy `.env.example` to `.env` and set:

- `MCP_OPENAPI_SPEC_URL` (e.g. your API's `/openapi.json` URL)
- `MCP_API_BASE_URL` (e.g. `http://api:3000` if the API runs in another container)

From the project root:

```bash
docker compose up -d
```

The MCP server will be available at `http://localhost:3100/mcp` (Streamable HTTP).
To use a local OpenAPI file instead of a URL, set MCP_OPENAPI_SPEC_FILE and mount the file into the container (see docker-compose.yaml comments if present).
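The compose setup is roughly of this shape (a hypothetical sketch for orientation; the repository's actual docker-compose.yaml is authoritative):

```yaml
services:
  openapi-to-mcp:
    image: evilfreelancer/openapi-to-mcp:latest
    ports:
      - "3100:3100"
    env_file: .env
```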
Tests
Tests cover: config (env vars, include/exclude, defaults), OpenAPI loader (URL and file, URL over file, error when both unset), and openapi-to-tools (filtering, prefix, handler calling API with success and error). HTTP is mocked (axios-mock-adapter).
Dockerfile
The project includes a Dockerfile (Node 20 Alpine): it installs dependencies, builds the TypeScript, prunes to production dependencies, and runs `node dist/index.js`. No dev dependencies or tests end up in the image. Pre-built images are published to Docker Hub; to build locally, run `docker build -t openapi-to-mcp .`.
CI - Docker image on Docker Hub
A GitHub Actions workflow (.github/workflows/docker-publish.yml) runs tests, then builds the image and pushes it to Docker Hub.
- Triggers: manually (Actions → "Docker build and push" → Run workflow) or on push of any git tag.
- Version: on tag push, the image tag equals the git tag (e.g. `v1.0.0`); on a manual run you can set a version (default `latest`).
- Main only: when triggered by a tag, the workflow checks that the tag points to a commit on `main`; otherwise the run fails.
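The trigger section of such a workflow typically looks like the following (a hypothetical sketch; `.github/workflows/docker-publish.yml` is authoritative):

```yaml
name: Docker build and push
on:
  workflow_dispatch:
    inputs:
      version:
        description: Image tag to publish
        default: latest
  push:
    tags:
      - "*"
```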
Required repository secrets (Settings → Secrets and variables → Actions): the Docker Hub username (the image is published under this account) and a Docker Hub access token (recommended) or password. See `.github/workflows/docker-publish.yml` for the exact secret names.
Similar projects
- mcp-openapi-proxy (Python) – MCP server that exposes REST APIs from OpenAPI specs as MCP tools. Low-level mode (one tool per endpoint) or FastMCP mode; auth and endpoint filtering. Install: `uvx mcp-openapi-proxy`.
- openapi-mcp-proxy (TypeScript) – CLI that turns an OpenAPI service into an MCP server; middleware between OpenAPI and MCP clients.
- openapi-mcp-generator (TypeScript) – Generates a full MCP server project from OpenAPI 3.0+ (stdio, SSE, Streamable HTTP), with Zod validation and auth. Install: `npm install -g openapi-mcp-generator`.
- FastMCP + OpenAPI (Python) – OpenAPI integration for FastMCP: auth, route mapping, parameter handling.
- openapi-mcp-codegen – Code generator from OpenAPI to MCP server (Apache 2.0).
- Swagger MCP (Vizioz) – AI-driven MCP server generation from Swagger/OpenAPI; stores specs locally.
- liblab – Cloud service: generate and deploy an MCP server from an OpenAPI spec or Postman collection.