The airflow-mcp-server is a Model Context Protocol server that provides comprehensive control over Apache Airflow instances via the Airflow REST API. It operates in either read-only (safe) mode or read-write (unsafe) mode.
Capabilities include:
DAG Management: List, create, update, delete, parse, and monitor DAGs and DAG runs
Task Management: Control task instances, view dependencies, update states, and retrieve logs
Resource Management: Create and manage connections, pools, variables, XCom entries
User Administration: Handle users, roles, and permissions
Dataset Operations: Manage datasets and related events
System Information: Retrieve health status, configuration, version details, and providers
The server supports JWT authentication and automatically parses the OpenAPI specification from the Airflow instance.
airflow-mcp-server: An MCP Server for controlling Airflow 3
mcp-name: io.github.abhishekbhakat/airflow-mcp-server
MCPHub Certification
This MCP server is certified by MCPHub. This certification ensures that airflow-mcp-server follows best practices for Model Context Protocol implementation.
Overview
A Model Context Protocol server for controlling Airflow via Airflow APIs.
Demo Video
https://github.com/user-attachments/assets/f3e60fff-8680-4dd9-b08e-fa7db655a705
Setup
Usage with Claude Desktop
Stdio Transport (Default)
See CONFIG.md for IDE-specific configuration examples across popular MCP clients.
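For example, a stdio entry for Claude Desktop might look like the sketch below. The uvx launcher and the AIRFLOW_BASE_URL variable name are assumptions to adapt to your install; AUTH_TOKEN is the JWT described under Considerations.

```json
{
  "mcpServers": {
    "airflow-mcp-server": {
      "command": "uvx",
      "args": ["airflow-mcp-server"],
      "env": {
        "AIRFLOW_BASE_URL": "http://localhost:8080",
        "AUTH_TOKEN": "<your_jwt_token>"
      }
    }
  }
}
```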
HTTP Transport
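If the server is already running with --http, the client entry points at the HTTP endpoint instead of launching a process. This is only a sketch: the port and the /mcp path are assumptions that depend on how you started the server, and not every MCP client accepts a URL-based entry.

```json
{
  "mcpServers": {
    "airflow-mcp-server": {
      "url": "http://localhost:3000/mcp"
    }
  }
}
```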
Note:
Set base_url to the root Airflow URL (e.g., http://localhost:8080). Do not include /api/v2 in the base URL. The server will automatically fetch the OpenAPI spec from ${base_url}/openapi.json.
Only a JWT token is required for authentication. Cookie and basic auth are no longer supported in Airflow 3.0.
Transport Options
The server supports multiple transport protocols:
Stdio Transport (Default)
Standard input/output transport for direct process communication:
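A minimal launch sketch, assuming the package exposes an airflow-mcp-server executable; apart from AUTH_TOKEN (documented under Considerations), the variable names are assumptions:

```bash
# AIRFLOW_BASE_URL is an assumed variable name; AUTH_TOKEN is documented below
export AIRFLOW_BASE_URL="http://localhost:8080"   # root Airflow URL, without /api/v2
export AUTH_TOKEN="<your_jwt_token>"              # JWT from your Airflow 3 instance
airflow-mcp-server                                # stdio is the default transport
```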
HTTP Transport
Uses Streamable HTTP for better scalability and web compatibility:
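A sketch of starting the HTTP transport; --http is documented here, while any host or port options are not, so check the CLI help for the exact flags:

```bash
# Switch from stdio to streamable HTTP; listening address/port flags are assumptions
airflow-mcp-server --http
```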
Note: SSE transport is deprecated. Use --http for new deployments, as it provides better bidirectional communication and is the approach recommended by FastMCP.
Operation Modes
The server supports two operation modes:
Safe Mode (--safe): Only allows read-only operations (GET requests). This is useful when you want to prevent any modifications to your Airflow instance.
Unsafe Mode (--unsafe): Allows all operations, including modifications. This is the default mode.
To start in safe mode:
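For example, assuming the airflow-mcp-server executable from the sketches above:

```bash
airflow-mcp-server --safe
```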
To explicitly start in unsafe mode (though this is the default):
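Again a sketch with the same assumptions:

```bash
airflow-mcp-server --unsafe
```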
Tool Discovery Modes
The server supports two tool discovery approaches:
Hierarchical Discovery (default): Tools are organized by categories (DAGs, Tasks, Connections, etc.). Browse categories first, then select specific tools. More manageable for large APIs.
Static Tools (--static-tools): All tools available immediately. Better for programmatic access but can be overwhelming.
To use static tools:
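A sketch with the same executable-name assumption as above:

```bash
airflow-mcp-server --static-tools
```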
Command Line Options
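The CLI itself is authoritative; as a sketch, the block below assumes a standard --help flag and only recaps the options referenced in this document:

```bash
airflow-mcp-server --help
# Flags referenced in this README:
#   --safe           read-only mode (GET requests only)
#   --unsafe         read-write mode (the default)
#   --static-tools   expose all tools immediately instead of hierarchical discovery
#   --http           serve over streamable HTTP instead of stdio
```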
Using Resources
Point the server at a folder of Markdown guides whenever you want agents to reference local documentation:
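For example, using the environment variable documented below (whether an equivalent CLI flag exists depends on your version):

```bash
# Assumes your Markdown guides live in /path/to/docs
AIRFLOW_MCP_RESOURCES_DIR=/path/to/docs airflow-mcp-server
```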
Every top-level .md / .markdown file becomes a read-only resource (file:///<slug>) visible in your MCP client.
The first # Heading in each file (if present) is used as the resource title; otherwise the filename stem is used.
Set AIRFLOW_MCP_RESOURCES_DIR=/path/to/docs if you prefer environment-based configuration.
Update the files on disk and restart the server to refresh the resources list.
Considerations
Authentication
Only JWT authentication is supported in Airflow 3.0. You must provide a valid AUTH_TOKEN.
Page Limit
The default page limit is 100 items, but you can change it using the maximum_page_limit option in the [api] section of your airflow.cfg file.
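For example, in airflow.cfg:

```ini
[api]
# Raise the maximum page size returned by list endpoints (default is 100)
maximum_page_limit = 500
```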
Transport Selection
Use stdio transport for direct process communication (default)
Use HTTP transport for web deployments, multiple clients, or when you need better scalability
Avoid SSE transport as it's deprecated in favor of HTTP transport