@abumalick/openapi-mcp

MCP (Model Context Protocol) server that enables LLMs to explore OpenAPI specifications. Load any OpenAPI spec and query its endpoints, parameters, request bodies, and response schemas through natural conversation.

Installation

npm install -g @abumalick/openapi-mcp

Configuration

You can pre-load OpenAPI specs at startup using the `--spec` (or `-s`) flag, in the format `alias=url`.
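For example, the server can be started manually with a pre-loaded spec (same npx invocation and Petstore URL as in the config examples below):

```shell
npx -y @abumalick/openapi-mcp \
  --spec petstore=https://petstore3.swagger.io/api/v3/openapi.yaml
```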

OpenCode

Add to your OpenCode config (~/.opencode/config.json):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "openapi": {
      "type": "local",
      "command": ["npx", "-y", "@abumalick/openapi-mcp"],
      "enabled": true
    }
  }
}
```

With pre-loaded specs:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "openapi": {
      "type": "local",
      "command": [
        "npx",
        "-y",
        "@abumalick/openapi-mcp",
        "--spec",
        "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml",
        "--spec",
        "myapi=https://api.example.com/openapi.json"
      ],
      "enabled": true
    }
  }
}
```

Learn more about configuring MCP servers in OpenCode.

Claude Desktop

Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": ["-y", "@abumalick/openapi-mcp"]
    }
  }
}
```

With pre-loaded specs:

```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": [
        "-y",
        "@abumalick/openapi-mcp",
        "--spec",
        "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml"
      ]
    }
  }
}
```

Example Workflow

With pre-loaded specs, the LLM can immediately query them:

```
User: What specs are available?
Assistant: [calls openapi_list_specs] -> shows petstore is loaded

User: What endpoints are available for pets?
Assistant: [calls openapi_list_endpoints with alias="petstore", tag="pet"]

User: Show me details about the GET /pet/{petId} endpoint
Assistant: [calls openapi_get_endpoint with alias="petstore", method="GET", path="/pet/{petId}"]
```

Without pre-loading, first load the spec:

```
User: Load the Petstore API spec
Assistant: [calls openapi_load with source="https://petstore3.swagger.io/api/v3/openapi.yaml", alias="petstore"]
```

Tools

openapi_list_specs

List all currently loaded OpenAPI specs. Use this to see what's available.

```json
{}
```

Returns:

```json
{
  "specs": [
    {
      "alias": "petstore",
      "title": "Swagger Petstore",
      "version": "1.0.0",
      "endpointCount": 19
    }
  ]
}
```

openapi_load

Load an OpenAPI spec from a URL or file path. Only needed if the spec was not pre-loaded at startup.

```json
{ "source": "https://api.example.com/openapi.yaml", "alias": "example" }
```

openapi_list_endpoints

List a spec's endpoints, with optional filtering by tag or search term.

```json
{ "alias": "example", "tag": "users", "search": "create" }
```

openapi_get_endpoint

Get detailed information about a single endpoint, including its parameters, request body, and response schemas.

```json
{ "alias": "example", "method": "POST", "path": "/users" }
```
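Under the hood, an MCP client invokes each of these tools as a JSON-RPC 2.0 `tools/call` request sent over the server's stdio transport. A minimal Python sketch of that envelope for `openapi_get_endpoint` (the helper name `make_tool_call` and the request id are illustrative, not part of this package):

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build an MCP tools/call request as a JSON-RPC 2.0 message.

    The tool name and argument shape come from the tool docs above;
    the request id is an arbitrary client-chosen value.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

request = make_tool_call(
    "openapi_get_endpoint",
    {"alias": "example", "method": "POST", "path": "/users"},
)

# Each message is serialized as a single line of JSON on the server's stdin.
wire = json.dumps(request)
print(wire)
```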

Supported OpenAPI Versions

  • OpenAPI 3.0.x

  • OpenAPI 3.1.x

Note: Swagger 2.0 specs should be converted to OpenAPI 3.x format first.
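One common way to do the conversion is the third-party swagger2openapi npm package (not part of this project; the file names here are hypothetical):

```shell
# Convert a local Swagger 2.0 spec to OpenAPI 3.0
npx -y swagger2openapi swagger.json -o openapi.json
```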

Development

```shell
# Install dependencies
npm install

# Build
npm run build

# Run tests
npm test

# Development mode
npm run dev
```

License

MIT

