Provides comprehensive access to Apache Airflow's REST API, enabling capabilities such as DAG management, task monitoring, log retrieval, and system health diagnostics.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@astro-airflow-mcp diagnose why the production_etl DAG failed and show the logs".
That's it! The server will respond to your query, and you can continue using it as needed.
Airflow MCP Server
A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.
Quickstart
IDEs
Add to your MCP settings (Cursor: ~/.cursor/mcp.json, VS Code: .vscode/mcp.json):
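A minimal entry might look like the following; the PyPI package name `astro-airflow-mcp` is assumed from the server name used above, so adjust it to match the actual package. Cursor uses the `mcpServers` key shown here, while VS Code's `.vscode/mcp.json` nests entries under `servers`.

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```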
CLI Tools
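For example, Claude Code can register the server with `claude mcp add` (same assumed package name as above):

```bash
# "airflow" is an arbitrary local name for the server
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```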
Desktop Apps
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
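The entry has the same shape as the IDE example above, again assuming the `astro-airflow-mcp` package name:

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```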
Other MCP Clients
Add to your MCP configuration file:
Or connect to a running HTTP server: "url": "http://localhost:8000/mcp"
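As a sketch, the HTTP variant of the entry could look like this (the exact key layout depends on the client):

```json
{
  "mcpServers": {
    "airflow": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
```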
Note: No installation is required; `uvx` runs the server directly from PyPI. The `--transport stdio` flag is required because the server defaults to HTTP mode.
Configuration
By default, the server connects to http://localhost:8080 (Astro CLI default). Set environment variables for custom Airflow instances:
| Variable | Description |
| --- | --- |
|  | Airflow webserver URL |
|  | Username (Airflow 3.x uses OAuth2 token exchange) |
|  | Password |
|  | Bearer token (alternative to username/password) |
Example with auth (Claude Code):
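A sketch using `claude mcp add` with `-e` to pass credentials; the variable names below are placeholders, so substitute the names listed in the table above:

```bash
# Placeholder variable names; use the names from the configuration table above
claude mcp add airflow \
  -e AIRFLOW_BASE_URL=https://airflow.example.com \
  -e AIRFLOW_USERNAME=admin \
  -e AIRFLOW_PASSWORD=your-password \
  -- uvx astro-airflow-mcp --transport stdio
```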
Features
Airflow 2.x and 3.x Support: Automatic version detection with adapter pattern
MCP Tools for accessing Airflow data:
DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
Task management (list, get details, get task instances, get logs)
Pool management (list, get details)
Variable management (list, get specific variables)
Connection management (list connections with credentials excluded)
Asset/Dataset management (unified naming across versions, data lineage)
Plugin and provider information
Configuration and version details
Consolidated Tools for agent workflows:
`explore_dag`: Get comprehensive DAG information in one call
`diagnose_dag_run`: Debug failed DAG runs with task instance details
`get_system_health`: System overview with health, errors, and warnings
MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
Dual deployment modes:
Standalone server: Run as an independent MCP server
Airflow plugin: Integrate directly into Airflow 3.x webserver
Flexible Authentication:
Bearer token (Airflow 2.x and 3.x)
Username/password with automatic OAuth2 token exchange (Airflow 3.x)
Basic auth (Airflow 2.x)
Available Tools
Consolidated Tools (Agent-Optimized)
| Tool | Description |
| --- | --- |
| `explore_dag` | Get comprehensive DAG info: metadata, tasks, recent runs, source code |
| `diagnose_dag_run` | Debug a DAG run: run details, failed task instances, logs |
| `get_system_health` | System overview: health status, import errors, warnings, DAG stats |
Core Tools
| Tool | Description |
| --- | --- |
|  | Get all DAGs and their metadata |
|  | Get detailed info about a specific DAG |
|  | Get the source code of a DAG |
|  | Get DAG run statistics (Airflow 3.x only) |
|  | Get DAG import warnings |
|  | Get import errors from DAG files that failed to parse |
|  | Get DAG run history |
|  | Get specific DAG run details |
|  | Trigger a new DAG run (start a workflow execution) |
|  | Pause a DAG to prevent new scheduled runs |
|  | Unpause a DAG to resume scheduled runs |
|  | Get all tasks in a DAG |
|  | Get details about a specific task |
|  | Get task instance execution details |
|  | Get logs for a specific task instance execution |
|  | Get all resource pools |
|  | Get details about a specific pool |
|  | Get all Airflow variables |
|  | Get a specific variable by key |
|  | Get all connections (credentials excluded for security) |
|  | Get assets/datasets (unified naming across versions) |
|  | Get installed Airflow plugins |
|  | Get installed provider packages |
|  | Get Airflow configuration |
|  | Get Airflow version information |
MCP Resources
| Resource URI | Description |
| --- | --- |
|  | Airflow version information |
|  | Installed provider packages |
|  | Installed Airflow plugins |
|  | Airflow configuration |
MCP Prompts
| Prompt | Description |
| --- | --- |
|  | Guided workflow for diagnosing DAG failures |
|  | Morning health check routine |
|  | Guide for understanding a new DAG |
Advanced Usage
Running as Standalone Server
For HTTP-based integrations or connecting multiple clients to one server:
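A minimal sketch, assuming the same package name as above; since HTTP is the default transport, no `--transport` flag is needed:

```bash
# Serves MCP over HTTP (the default transport); see CLI Options below for host/port flags
uvx astro-airflow-mcp
```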
Connect MCP clients to: http://localhost:8000/mcp
Airflow Plugin Mode
Install into your Airflow 3.x environment to expose MCP at http://your-airflow:8080/mcp/v1:
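A sketch of the install step, again assuming the package is published as `astro-airflow-mcp`; restart the Airflow webserver afterwards so the plugin is picked up:

```bash
# Install into the same Python environment as the Airflow 3.x webserver
pip install astro-airflow-mcp
```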
CLI Options
| Flag | Environment Variable | Default | Description |
| --- | --- | --- | --- |
| `--transport` |  |  | Transport mode (defaults to HTTP) |
|  |  |  | Host to bind to (HTTP mode only) |
|  |  |  | Port to bind to (HTTP mode only) |
|  |  | Auto-discovered or `http://localhost:8080` | Airflow webserver URL |
|  |  |  | Astro project directory used to auto-discover the Airflow URL |
|  |  |  | Bearer token for authentication |
|  |  |  | Username for authentication (Airflow 3.x uses OAuth2 token exchange) |
|  |  |  | Password for authentication |
Architecture
The server is built using FastMCP with an adapter pattern for Airflow version compatibility:
Core Components
Adapters (`adapters/`): Version-specific API implementations (sketched below)
- `AirflowAdapter` (base): Abstract interface for all Airflow API operations
- `AirflowV2Adapter`: Airflow 2.x API (`/api/v1`) with basic auth
- `AirflowV3Adapter`: Airflow 3.x API (`/api/v2`) with OAuth2 token exchange
Version Detection: Automatic detection at startup by probing API endpoints
Models (`models.py`): Pydantic models for type-safe API responses
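A rough sketch of how the adapter pattern and version detection fit together; the class names match the list above, but everything else (method names, probe endpoint, auth handling) is illustrative rather than the actual implementation:

```python
from abc import ABC, abstractmethod

import httpx


class AirflowAdapter(ABC):
    """Abstract interface for all Airflow API operations."""

    def __init__(self, base_url: str):
        self.base_url = base_url

    @abstractmethod
    async def list_dags(self, **kwargs) -> dict:
        """Extra kwargs are passed straight to the API for forward compatibility."""


class AirflowV2Adapter(AirflowAdapter):
    """Airflow 2.x: /api/v1 (basic auth omitted in this sketch)."""

    async def list_dags(self, **kwargs) -> dict:
        async with httpx.AsyncClient() as client:
            resp = await client.get(f"{self.base_url}/api/v1/dags", params=kwargs)
            resp.raise_for_status()
            return resp.json()


class AirflowV3Adapter(AirflowAdapter):
    """Airflow 3.x: /api/v2 (OAuth2 token exchange omitted in this sketch)."""

    async def list_dags(self, **kwargs) -> dict:
        async with httpx.AsyncClient() as client:
            resp = await client.get(f"{self.base_url}/api/v2/dags", params=kwargs)
            resp.raise_for_status()
            return resp.json()


async def detect_adapter(base_url: str) -> AirflowAdapter:
    """Probe the API at startup and pick the matching adapter."""
    async with httpx.AsyncClient() as client:
        # Illustrative probe: if an Airflow 3.x endpoint answers, use the v3 adapter.
        resp = await client.get(f"{base_url}/api/v2/version")
    return AirflowV3Adapter(base_url) if resp.status_code == 200 else AirflowV2Adapter(base_url)
```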
Version Handling Strategy
Major versions (2.x vs 3.x): Adapter pattern with runtime version detection
Minor versions (3.1 vs 3.2): Runtime feature detection with graceful fallbacks
New API parameters: Pass-through `**kwargs` for forward compatibility
Deployment Modes
Standalone: Independent ASGI application with HTTP/SSE transport
Plugin: Mounted into Airflow 3.x FastAPI webserver
Development
Contributing
Contributions welcome! Please ensure:
All tests pass (`make test`)
Code passes linting (`make check`)
prek hooks pass (`make prek`)
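For example, running all three locally before opening a PR:

```bash
# Run the full local check suite
make test && make check && make prek
```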