# @abumalick/openapi-mcp
MCP (Model Context Protocol) server that enables LLMs to explore OpenAPI specifications. Load any OpenAPI spec and query its endpoints, parameters, request bodies, and response schemas through natural conversation.
## Installation

```bash
npm install -g @abumalick/openapi-mcp
```

## Configuration
You can pre-load OpenAPI specs at startup using the `--spec` (or `-s`) flag with the format `alias=url`.
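The `alias=url` pair presumably splits on the first `=` only, so URLs that carry their own `=` (for example in a query string) stay intact. A minimal sketch of that parsing rule, assuming first-`=` splitting (illustrative, not the package's actual code):

```python
def parse_spec_arg(arg: str) -> tuple[str, str]:
    """Split an alias=url argument on the first '=' only.

    The alias cannot contain '=', but the URL or file path after it may.
    """
    alias, sep, source = arg.partition("=")
    if not sep or not alias:
        raise ValueError(f"expected alias=url, got {arg!r}")
    return alias, source

print(parse_spec_arg("petstore=https://petstore3.swagger.io/api/v3/openapi.yaml"))
```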
### OpenCode

Add to your OpenCode config (`~/.opencode/config.json`):
```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "openapi": {
      "type": "local",
      "command": ["npx", "-y", "@abumalick/openapi-mcp"],
      "enabled": true
    }
  }
}
```

With pre-loaded specs:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "openapi": {
      "type": "local",
      "command": [
        "npx", "-y", "@abumalick/openapi-mcp",
        "--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml",
        "--spec", "myapi=https://api.example.com/openapi.json"
      ],
      "enabled": true
    }
  }
}
```

Learn more about configuring MCP servers in the OpenCode documentation.
### Claude Desktop

Add to your Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):
```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": ["-y", "@abumalick/openapi-mcp"]
    }
  }
}
```

With pre-loaded specs:
```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": [
        "-y", "@abumalick/openapi-mcp",
        "--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml"
      ]
    }
  }
}
```

## Example Workflow
With pre-loaded specs, the LLM can immediately query them:
```
User: What specs are available?
Assistant: [calls openapi_list_specs] -> shows petstore is loaded

User: What endpoints are available for pets?
Assistant: [calls openapi_list_endpoints with alias="petstore", tag="pet"]

User: Show me details about the GET /pet/{petId} endpoint
Assistant: [calls openapi_get_endpoint with alias="petstore", method="GET", path="/pet/{petId}"]
```

Without pre-loading, first load the spec:
```
User: Load the Petstore API spec
Assistant: [calls openapi_load with source="https://petstore3.swagger.io/api/v3/openapi.yaml", alias="petstore"]
```

## Tools
### openapi_list_specs

List all currently loaded OpenAPI specs. Use this to see what's available.

```json
{}
```

Returns:
```json
{
  "specs": [
    { "alias": "petstore", "title": "Swagger Petstore", "version": "1.0.0", "endpointCount": 19 }
  ]
}
```

### openapi_load
Load an OpenAPI spec from a URL or file path. Only needed if the spec was not pre-loaded.
```json
{
  "source": "https://api.example.com/openapi.yaml",
  "alias": "example"
}
```

### openapi_list_endpoints
List endpoints with optional filtering.
```json
{
  "alias": "example",
  "tag": "users",
  "search": "create"
}
```

### openapi_get_endpoint
Get detailed endpoint information.
```json
{
  "alias": "example",
  "method": "POST",
  "path": "/users"
}
```

## Supported OpenAPI Versions
- OpenAPI 3.0.x
- OpenAPI 3.1.x
Note: Swagger 2.0 specs should be converted to OpenAPI 3.x format first.
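Since Swagger 2.0 documents are not accepted, it can help to check which flavor a spec is before loading it. The two formats are distinguishable by their top-level version key: OpenAPI 3.x documents declare an `openapi` field, while Swagger 2.0 documents use `swagger`. A quick sketch of such a check (not part of this package):

```python
def spec_flavor(doc: dict) -> str:
    """Report whether a parsed spec document is OpenAPI 3.x or Swagger 2.0."""
    if "openapi" in doc:
        # e.g. {"openapi": "3.1.0", ...}
        return f"OpenAPI {doc['openapi']}"
    if doc.get("swagger") == "2.0":
        return "Swagger 2.0 -- convert to OpenAPI 3.x first"
    return "unknown"

print(spec_flavor({"openapi": "3.1.0", "info": {"title": "Example"}}))
# -> OpenAPI 3.1.0
```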
## Development

```bash
# Install dependencies
npm install

# Build
npm run build

# Run tests
npm test

# Development mode
npm run dev
```

## License
MIT