# @abumalick/openapi-mcp
MCP (Model Context Protocol) server that enables LLMs to explore OpenAPI specifications. Load any OpenAPI spec and query its endpoints, parameters, request bodies, and response schemas through natural conversation.
## Installation
```bash
npm install -g @abumalick/openapi-mcp
```
## Configuration
You can pre-load OpenAPI specs at startup using the `--spec` (or `-s`) flag, in the format `alias=url`. Pass the flag once per spec.
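A quick way to try the flag syntax from a terminal, using the same example specs as the client configs below (the `api.example.com` URL is a placeholder):

```bash
# Start the server with two specs pre-loaded; each --spec takes alias=url
npx -y @abumalick/openapi-mcp \
  --spec petstore=https://petstore3.swagger.io/api/v3/openapi.yaml \
  --spec myapi=https://api.example.com/openapi.json
```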
### OpenCode
Add to your OpenCode config (`~/.opencode/config.json`):
```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "openapi": {
      "type": "local",
      "command": ["npx", "-y", "@abumalick/openapi-mcp"],
      "enabled": true
    }
  }
}
```
With pre-loaded specs:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "openapi": {
      "type": "local",
      "command": [
        "npx", "-y", "@abumalick/openapi-mcp",
        "--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml",
        "--spec", "myapi=https://api.example.com/openapi.json"
      ],
      "enabled": true
    }
  }
}
```
Learn more about [configuring MCP servers in OpenCode](https://opencode.ai/docs/mcp-servers).
### Claude Desktop
Add to your Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": ["-y", "@abumalick/openapi-mcp"]
    }
  }
}
```
With pre-loaded specs:
```json
{
  "mcpServers": {
    "openapi": {
      "command": "npx",
      "args": [
        "-y", "@abumalick/openapi-mcp",
        "--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml"
      ]
    }
  }
}
```
## Example Workflow
With pre-loaded specs, the LLM can immediately query them:
```
User: What specs are available?
Assistant: [calls openapi_list_specs] -> shows petstore is loaded
User: What endpoints are available for pets?
Assistant: [calls openapi_list_endpoints with alias="petstore", tag="pet"]
User: Show me details about the GET /pet/{petId} endpoint
Assistant: [calls openapi_get_endpoint with alias="petstore", method="GET", path="/pet/{petId}"]
```
Without pre-loading, first load the spec:
```
User: Load the Petstore API spec
Assistant: [calls openapi_load with source="https://petstore3.swagger.io/api/v3/openapi.yaml", alias="petstore"]
```
## Tools
### openapi_list_specs
List all currently loaded OpenAPI specs. Use this to see what's available.
```json
{}
```
Returns:
```json
{
  "specs": [
    { "alias": "petstore", "title": "Swagger Petstore", "version": "1.0.0", "endpointCount": 19 }
  ]
}
```
### openapi_load
Load an OpenAPI spec from a URL or a local file path. Only needed if the spec was not pre-loaded at startup.
```json
{
  "source": "https://api.example.com/openapi.yaml",
  "alias": "example"
}
```
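Local file paths work the same way (the path below is hypothetical):

```json
{
  "source": "./specs/internal-api.yaml",
  "alias": "internal"
}
```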
### openapi_list_endpoints
List a spec's endpoints, optionally filtered by tag and/or a search string. All filters are optional.
```json
{
  "alias": "example",
  "tag": "users",
  "search": "create"
}
```
### openapi_get_endpoint
Get detailed information for a single endpoint, including its parameters, request body, and response schemas.
```json
{
  "alias": "example",
  "method": "POST",
  "path": "/users"
}
```
## Supported OpenAPI Versions
- OpenAPI 3.0.x
- OpenAPI 3.1.x
Note: Swagger 2.0 specs should be converted to OpenAPI 3.x format first.
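One common way to convert is the community `swagger2openapi` CLI (usage shown is its typical invocation; check that tool's own docs for current flags):

```bash
# Convert a Swagger 2.0 spec to OpenAPI 3.0
npx -y swagger2openapi swagger.json -o openapi.json
```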
## Development
```bash
# Install dependencies
npm install
# Build
npm run build
# Run tests
npm test
# Development mode
npm run dev
```
## License
MIT