The MCP Server for Apache Airflow provides a standardized interface that allows MCP clients to interact with Apache Airflow through the Model Context Protocol. With this server, you can:
- **DAG Management**: List, retrieve, pause, unpause, update, and delete DAGs; manage DAG sources and reparse DAG files
- **DAG Runs**: List, create, retrieve, update, delete, clear, and set notes for DAG runs; manage upstream dataset events
- **Tasks**: Manage tasks and task instances, including listing, retrieving, updating, clearing, and setting states
- **Resources**: Create, list, update, and delete variables, connections, and pools
- **Data Exchange**: List and retrieve XComs; manage datasets and dataset events
- **Monitoring**: Check health status, retrieve DAG statistics, and view configuration and system information
- **System Info**: Access plugins, providers, event logs, import errors, and version information
mcp-server-apache-airflow
A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.
About
This project implements a Model Context Protocol server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
Feature Implementation Status
| Feature | Status |
| --- | --- |
| **DAG Management** | |
| List DAGs | ✅ |
| Get DAG Details | ✅ |
| Pause DAG | ✅ |
| Unpause DAG | ✅ |
| Update DAG | ✅ |
| Delete DAG | ✅ |
| Get DAG Source | ✅ |
| Patch Multiple DAGs | ✅ |
| Reparse DAG File | ✅ |
| **DAG Runs** | |
| List DAG Runs | ✅ |
| Create DAG Run | ✅ |
| Get DAG Run Details | ✅ |
| Update DAG Run | ✅ |
| Delete DAG Run | ✅ |
| Get DAG Runs Batch | ✅ |
| Clear DAG Run | ✅ |
| Set DAG Run Note | ✅ |
| Get Upstream Dataset Events | ✅ |
| **Tasks** | |
| List DAG Tasks | ✅ |
| Get Task Details | ✅ |
| Get Task Instance | ✅ |
| List Task Instances | ✅ |
| Update Task Instance | ✅ |
| Clear Task Instances | ✅ |
| Set Task Instances State | ✅ |
| **Variables** | |
| List Variables | ✅ |
| Create Variable | ✅ |
| Get Variable | ✅ |
| Update Variable | ✅ |
| Delete Variable | ✅ |
| **Connections** | |
| List Connections | ✅ |
| Create Connection | ✅ |
| Get Connection | ✅ |
| Update Connection | ✅ |
| Delete Connection | ✅ |
| Test Connection | ✅ |
| **Pools** | |
| List Pools | ✅ |
| Create Pool | ✅ |
| Get Pool | ✅ |
| Update Pool | ✅ |
| Delete Pool | ✅ |
| **XComs** | |
| List XComs | ✅ |
| Get XCom Entry | ✅ |
| **Datasets** | |
| List Datasets | ✅ |
| Get Dataset | ✅ |
| Get Dataset Events | ✅ |
| Create Dataset Event | ✅ |
| Get DAG Dataset Queued Event | ✅ |
| Get DAG Dataset Queued Events | ✅ |
| Delete DAG Dataset Queued Event | ✅ |
| Delete DAG Dataset Queued Events | ✅ |
| Get Dataset Queued Events | ✅ |
| Delete Dataset Queued Events | ✅ |
| **Monitoring** | |
| Get Health | ✅ |
| **DAG Stats** | |
| Get DAG Stats | ✅ |
| **Config** | |
| Get Config | ✅ |
| **Plugins** | |
| Get Plugins | ✅ |
| **Providers** | |
| List Providers | ✅ |
| **Event Logs** | |
| List Event Logs | ✅ |
| Get Event Log | ✅ |
| **System** | |
| Get Import Errors | ✅ |
| Get Import Error Details | ✅ |
| Get Health Status | ✅ |
| Get Version | ✅ |
Setup
Dependencies
This project depends on the official Apache Airflow client library (`apache-airflow-client`), which is installed automatically when you install this package.
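Installing the server itself is a one-liner; the sketch below assumes it is published on PyPI under the name `mcp-server-apache-airflow`, so verify the package name before running it:

```bash
# Hypothetical PyPI package name; pulls in apache-airflow-client as a dependency.
pip install mcp-server-apache-airflow
```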
Environment Variables
Set the environment variables the server uses to connect to your Airflow instance.
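The exact variable names may differ between releases; the sketch below assumes the server reads the Airflow base URL and basic-auth credentials from `AIRFLOW_HOST`, `AIRFLOW_USERNAME`, and `AIRFLOW_PASSWORD`:

```bash
# Hypothetical variable names -- check the project's documentation for the exact ones.
export AIRFLOW_HOST="https://your-airflow-host"   # Base URL of the Airflow webserver
export AIRFLOW_USERNAME="your-username"           # Basic-auth user for the Airflow REST API
export AIRFLOW_PASSWORD="your-password"           # Basic-auth password
```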
Usage with Claude Desktop
Add the server to your `claude_desktop_config.json`.
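One possible entry is sketched below, assuming the package is published on PyPI as `mcp-server-apache-airflow` and can be launched with `uvx`; the environment variable names follow the assumption above:

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```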
Alternatively, you can configure the entry to run the server from a local checkout with `uv`.
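A sketch under that assumption (the `mcp-server-apache-airflow` entry-point name is also an assumption):

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```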
Replace `/path/to/mcp-server-apache-airflow` with the actual path where you've cloned the repository.
Selecting the API groups
You can select which API groups to expose by setting the `--apis` flag (see the example after the list below). The default is to use all APIs. Allowed values are:
- `config`
- `connections`
- `dag`
- `dagrun`
- `dagstats`
- `dataset`
- `eventlog`
- `importerror`
- `monitoring`
- `plugin`
- `pool`
- `provider`
- `taskinstance`
- `variable`
- `xcom`
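For example, to expose only the DAG and DAG run tools (a sketch; the comma-separated argument format is an assumption):

```bash
# Hypothetical invocation: restrict the server to the "dag" and "dagrun" API groups.
uv run mcp-server-apache-airflow --apis "dag,dagrun"
```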
Manual Execution
You can also run the server manually with `make run`, which accepts the following options:

- `--port`: Port to listen on for SSE (default: 8000)
- `--transport`: Transport type (`stdio` or `sse`, default: `stdio`)
Alternatively, you can run the SSE server directly; it accepts the same parameters.
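A sketch of such a direct invocation, assuming the package exposes a console script named `mcp-server-apache-airflow`:

```bash
# Hypothetical direct invocation: serve over SSE on port 8000 instead of stdio.
uv run mcp-server-apache-airflow --transport sse --port 8000
```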
Installing via Smithery
To install Apache Airflow MCP Server for Claude Desktop automatically via Smithery:
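The usual Smithery CLI pattern is shown below; the package identifier is an assumption, so verify the exact name on the Smithery registry before running it:

```bash
# Hypothetical package identifier -- verify on smithery.ai before running.
npx -y @smithery/cli install mcp-server-apache-airflow --client claude
```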
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License