Enables exploration and querying of OpenAPI/Swagger specifications, allowing users to discover endpoints, parameters, request bodies, and response schemas through natural conversation. Supports loading specs from URLs or files and filtering endpoints by tags or search terms.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@OpenAPI MCP Server show me the endpoints for the petstore API".
That's it! The server will respond to your query, and you can continue using it as needed.
@abumalick/openapi-mcp
MCP (Model Context Protocol) server that enables LLMs to explore OpenAPI specifications. Load any OpenAPI spec and query its endpoints, parameters, request bodies, and response schemas through natural conversation.
Installation
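Assuming the server is published to npm under the repository name (an assumption here, not confirmed by this page), installation follows the usual pattern:

```shell
# install globally (package name assumed to match the repository)
npm install -g @abumalick/openapi-mcp

# or run it ad hoc without installing
npx -y @abumalick/openapi-mcp
```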
Configuration
You can pre-load OpenAPI specs at startup with the --spec (or -s) flag, which takes the format alias=url.
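For example, the following sketch pre-loads two specs at startup. The petstore URL is Swagger's public demo spec; the local file path is hypothetical:

```shell
npx -y @abumalick/openapi-mcp \
  --spec petstore=https://petstore3.swagger.io/api/v3/openapi.json \
  -s internal=./specs/internal-api.yaml
```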
OpenCode
Add to your OpenCode config (~/.opencode/config.json):
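A minimal sketch of a local MCP server entry; OpenCode's exact config keys vary by version, so treat the shape below as an assumption:

```json
{
  "mcp": {
    "openapi": {
      "type": "local",
      "command": ["npx", "-y", "@abumalick/openapi-mcp"]
    }
  }
}
```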
With pre-loaded specs:
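The same sketch with a spec pre-loaded via the --spec flag (config key names assumed, as above):

```json
{
  "mcp": {
    "openapi": {
      "type": "local",
      "command": [
        "npx", "-y", "@abumalick/openapi-mcp",
        "--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.json"
      ]
    }
  }
}
```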
Learn more about configuring MCP servers in OpenCode.
Claude Desktop
Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
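Claude Desktop uses the well-known mcpServers format; the npx invocation assumes the package is published under the repository name:

```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": ["-y", "@abumalick/openapi-mcp"]
    }
  }
}
```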
With pre-loaded specs:
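The same entry with a spec pre-loaded at startup:

```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": [
        "-y", "@abumalick/openapi-mcp",
        "--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.json"
      ]
    }
  }
}
```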
Example Workflow
With pre-loaded specs, the LLM can immediately query them:
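An illustrative exchange (the prompt and tool sequence are hypothetical, not captured output):

```
You: What parameters does the petstore endpoint for finding pets by status take?
LLM: [calls openapi_list_specs, then openapi_get_endpoint on the matching
     endpoint and summarizes its query parameters]
```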
Without pre-loading, first load the spec:
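Again illustrative only; the spec is loaded on demand before querying:

```
You: Load the spec at https://petstore3.swagger.io/api/v3/openapi.json
     as "petstore", then list its endpoints.
LLM: [calls openapi_load, then openapi_list_endpoints]
```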
Tools
openapi_list_specs
List all currently loaded OpenAPI specs. Use this to see what's available.
Returns:
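The exact payload is defined by the server; a plausible shape, assumed here purely for illustration, is a list of the loaded spec aliases:

```json
["petstore", "internal"]
```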
openapi_load
Load an OpenAPI spec from a URL or file path. Only needed if the spec was not pre-loaded at startup.
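A sketch of a tool call's arguments; the parameter names (alias, source) are hypothetical, not taken from the server's actual schema:

```json
{ "alias": "petstore", "source": "https://petstore3.swagger.io/api/v3/openapi.json" }
```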
openapi_list_endpoints
List endpoints with optional filtering.
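A sketch of filtered listing by tag and search term; parameter names are hypothetical:

```json
{ "alias": "petstore", "tag": "store", "search": "order" }
```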
openapi_get_endpoint
Get detailed endpoint information.
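A sketch of looking up one endpoint by method and path (parameter names hypothetical; the path is a real Swagger Petstore route):

```json
{ "alias": "petstore", "method": "GET", "path": "/pet/{petId}" }
```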
Supported OpenAPI Versions
OpenAPI 3.0.x
OpenAPI 3.1.x
Note: Swagger 2.0 specs should be converted to OpenAPI 3.x format first.
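One way to do the conversion is the separate swagger2openapi npm tool; it is shown here as an option, not something this server bundles:

```shell
npx -y swagger2openapi swagger-2.0.json --outfile openapi-3.0.json
```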
Development
License
MIT