
FastMCP OpenAPI

A FastMCP wrapper that dynamically generates MCP (Model Context Protocol) tools from OpenAPI specifications.

Quick Start

Prerequisites

  • Python 3.8+ with pip

  • Node.js 16+ (for MCP Inspector)

  • OpenAI API key (for LangChain demos)

Installation

pip install fastmcp-openapi

Basic Usage

# Generate MCP tools from any OpenAPI spec
fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json

# With authentication
fastmcp-openapi --spec https://api.example.com/openapi.json --auth-header "Bearer your-token"

# Multiple APIs
fastmcp-openapi --spec api1.json --spec api2.json --spec api3.json

Test with MCP Inspector

# Install MCP Inspector
npm install -g @modelcontextprotocol/inspector

# Test your OpenAPI tools
npx @modelcontextprotocol/inspector fastmcp-openapi --spec examples/simple_api.json

# Test Petstore API with Inspector
npx @modelcontextprotocol/inspector fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2

Claude Desktop Integration

Add to your Claude Desktop config:

{ "mcpServers": { "openapi-server": { "command": "fastmcp-openapi", "args": ["--spec", "https://api.example.com/openapi.json", "--auth-header", "Bearer your-token"] } } }

Features

  • Dynamic Tool Generation: Converts OpenAPI operations to MCP tools automatically

  • Type Safety: Full parameter validation using OpenAPI schemas

  • Authentication: Bearer tokens, API keys, Basic auth

  • Multiple APIs: Load multiple OpenAPI specs in one server

  • Real-time: Add/remove APIs without restart

Command Line Options

fastmcp-openapi --help

Options:
  --spec TEXT         OpenAPI specification URL or file path (can be used multiple times)
  --name TEXT         Server name (default: "OpenAPI Server")
  --auth-header TEXT  Authorization header (e.g., 'Bearer token123'). Must match order of --spec options.
  --base-url TEXT     Override base URL for API calls. Must match order of --spec options.
  --config TEXT       JSON config file with API specifications
  --transport TEXT    Transport: stdio, streamable-http, sse (default: stdio)
  --port INTEGER      Port for HTTP/SSE transport (default: 8000)
  --debug             Enable debug logging

Programmatic Usage

import asyncio

from fastmcp_openapi import OpenAPIServer

# Create server
server = OpenAPIServer("My API Server")

# Add OpenAPI specs (add_openapi_spec is a coroutine, so drive it with an event loop)
asyncio.run(server.add_openapi_spec(
    name="petstore",
    spec_url="https://petstore.swagger.io/v2/swagger.json",
    auth_header="Bearer your-token",
))

# Run server
server.run()

Examples

Multiple APIs with Different Auth

# Multiple APIs with different base URLs and auth
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --spec https://api.github.com/openapi.yaml \
  --spec ./local-api.json \
  --base-url https://petstore.swagger.io/v2 \
  --base-url https://api.github.com \
  --base-url http://localhost:3000 \
  --auth-header "Bearer petstore-token" \
  --auth-header "Bearer github-token" \
  --auth-header "Basic local-auth"

# Each API gets its own tools with prefixes:
# - api_1_getPetById (Petstore)
# - api_2_getUser (GitHub)
# - api_3_createItem (Local API)

Mixed API Sources

# Combine remote and local APIs
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --spec examples/simple_api.json \
  --spec https://jsonplaceholder.typicode.com/openapi.json \
  --base-url https://petstore.swagger.io/v2 \
  --base-url http://localhost:8080 \
  --base-url https://jsonplaceholder.typicode.com

# Creates a unified MCP server with tools from all APIs

Authenticated API

fastmcp-openapi \
  --spec https://api.example.com/openapi.json \
  --auth-header "Bearer your-oauth-token" \
  --base-url "https://api.example.com/v1"

Development Mode

# HTTP mode for web testing
fastmcp-openapi \
  --spec examples/simple_api.json \
  --transport streamable-http \
  --port 8080 \
  --debug

# SSE mode for MCP Inspector
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --base-url https://petstore.swagger.io/v2 \
  --transport sse \
  --port 8081 \
  --debug

LangChain Integration

# Install required dependencies
pip install langchain-openai langchain-mcp-adapters langgraph

# Set OpenAI API key
export OPENAI_API_KEY="your-openai-api-key"

# Start FastMCP server with HTTP transport
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --base-url https://petstore.swagger.io/v2 \
  --transport streamable-http \
  --port 8081

# Run LangChain test (in another terminal)
python test_mcp_langchain.py

The LangChain integration allows AI agents to use the generated MCP tools for natural language interaction with APIs.
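test_mcp_langchain.py is not reproduced here, but the pattern it follows is small. Below is a minimal sketch, assuming the server above is running with the streamable-http transport on port 8081 (endpoint path /mcp) and a recent langchain-mcp-adapters release; the exact client API differs slightly between adapter versions, so treat this as illustrative rather than the shipped script:

import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


async def main():
    # Assumed endpoint: FastMCP server started with --transport streamable-http --port 8081
    client = MultiServerMCPClient({
        "petstore": {
            "url": "http://localhost:8081/mcp",
            "transport": "streamable_http",
        }
    })
    tools = await client.get_tools()  # generated OpenAPI tools, wrapped as LangChain tools

    # Hand the tools to a ReAct-style agent (requires OPENAI_API_KEY)
    agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Look up the pet with ID 1."}]}
    )
    print(result["messages"][-1].content)


asyncio.run(main())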

How It Works

  1. Load OpenAPI Spec: Fetches and parses OpenAPI/Swagger specifications

  2. Generate Tools: Creates MCP tools for each API operation with proper schemas (see the sketch after this list)

  3. Handle Requests: Validates parameters and makes authenticated HTTP requests

  4. Return Results: Formats API responses for AI consumption
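Steps 1 and 2 carry most of the logic. The function below is a simplified, hypothetical illustration of that translation, not the package's internals; spec_to_tool_schemas and the exact schema mapping are assumptions made for the example:

import httpx


def spec_to_tool_schemas(spec_url: str) -> list[dict]:
    """Illustrative only: derive one MCP tool definition per OpenAPI operation."""
    spec = httpx.get(spec_url).json()  # step 1: load and parse the spec
    tools = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            if method not in {"get", "post", "put", "patch", "delete"}:
                continue
            # step 2: one tool per operation; parameters become the input schema
            properties = {
                p["name"]: p.get("schema", {"type": p.get("type", "string")})
                for p in op.get("parameters", [])
            }
            tools.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                "inputSchema": {"type": "object", "properties": properties},
            })
    return tools


print(len(spec_to_tool_schemas("https://petstore.swagger.io/v2/swagger.json")))

Run against the Petstore spec, this prints the number of operations that would be exposed as tools.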

Supported Features

  • ✅ OpenAPI 3.0.x, 3.1.x, Swagger 2.0

  • ✅ Path/query parameters, headers, request bodies

  • ✅ Authentication (Bearer, API Key, Basic)

  • ✅ Parameter validation and type checking

  • ✅ Multiple APIs in one server

  • ✅ Multiple transports: stdio, streamable-http, sse

  • ✅ LangChain integration for AI agents

  • ✅ MCP Inspector support for interactive testing

Testing & Examples

Quick Test with Petstore API

# 1. Start server with SSE transport
fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2 --transport sse --port 8081

# 2. Test with MCP Inspector (in another terminal)
npx @modelcontextprotocol/inspector fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2

# 3. Test with LangChain (requires OPENAI_API_KEY)
python test_mcp_langchain.py

Available Transport Modes

  • stdio: Standard input/output (default, for Claude Desktop)

  • streamable-http: HTTP-based transport (for LangChain integration)

  • sse: Server-Sent Events transport (for MCP Inspector); a minimal client sketch follows below
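To sanity-check a running server programmatically (for example, the SSE server started on port 8081 above), a minimal client sketch using the official mcp Python SDK might look like this; the /sse endpoint path is the SDK's conventional default and an assumption here:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main():
    # Assumed URL: fastmcp-openapi running with --transport sse --port 8081
    async with sse_client("http://localhost:8081/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                print(tool.name, "-", tool.description)


asyncio.run(main())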

Multiple API Management

FastMCP OpenAPI supports combining multiple OpenAPI specifications into a single MCP server, each with their own base URLs and authentication.

Configuration Methods

Method 1: Configuration File

Create a JSON config file to clearly define each API:

{ "apis": [ { "name": "petstore", "spec": "https://petstore.swagger.io/v2/swagger.json", "base_url": "https://petstore.swagger.io/v2", "auth": "Bearer petstore-api-key" }, { "name": "simple_api", "spec": "examples/simple_api.json", "base_url": "http://localhost:8080" } ] }
# Use the config file fastmcp-openapi --config examples/multi_api_config.json --transport sse --port 8081

Method 2: Command Line Arguments (Positional Matching)

⚠️ Important: arguments are matched by position; each --base-url and --auth-header applies to the --spec at the same index.

# Order matters: spec[0]→base_url[0]→auth[0], spec[1]→base_url[1]→auth[1], etc.
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --spec examples/simple_api.json \
  --spec https://api.github.com/openapi.yaml \
  --base-url https://petstore.swagger.io/v2 \
  --base-url http://localhost:8080 \
  --base-url https://api.github.com \
  --auth-header "Bearer petstore-key" \
  --auth-header "" \
  --auth-header "Bearer github-key"

# Position 0: Petstore spec → Petstore base URL → Petstore key
# Position 1: simple_api spec → localhost base URL → no auth (empty string)
# Position 2: GitHub spec → GitHub base URL → GitHub key

Benefits

  • Unified Interface: Access multiple APIs through one MCP server

  • Individual Configuration: Each API can have its own base URL and auth

  • Tool Namespacing: Tools are automatically prefixed to avoid conflicts

  • Mixed Sources: Combine remote APIs, local services, and files

Tool Naming Convention

When multiple APIs are loaded, tools are automatically prefixed:

  • Single API: operationId → getPetById

  • Multiple APIs: api_name_operationId → petstore_getPetById, github_getUser (illustrated in the sketch below)
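A small, hypothetical helper makes the rule concrete (mcp_tool_name is illustrative and not part of the package):

def mcp_tool_name(operation_id: str, api_name: str | None = None) -> str:
    """Illustrative prefixing rule: prepend the API name only when several APIs are loaded."""
    return f"{api_name}_{operation_id}" if api_name else operation_id


print(mcp_tool_name("getPetById"))              # getPetById (single API)
print(mcp_tool_name("getPetById", "petstore"))  # petstore_getPetById (multiple APIs)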

Development

git clone <repository>
cd fastmcp-openapi
pip install -e ".[dev]"
pytest

License

MIT License
