MCP Server for Apache Airflow
A Model Context Protocol (MCP) server that provides comprehensive integration with Apache Airflow's REST API. This server allows AI assistants to interact with Airflow workflows, monitor DAG runs, and manage tasks programmatically.
Features
- DAG Management: List, view details, pause, and unpause DAGs
- DAG Run Operations: Trigger new runs, list existing runs, and get detailed run information
- Task Instance Monitoring: View task instances and their execution details
- Universal Compatibility: Works with all popular Airflow hosting platforms:
  - Astronomer
  - Google Cloud Composer
  - Amazon MWAA
  - Self-hosted Airflow instances
- Comprehensive Logging: Access and monitor logs for debugging and troubleshooting:
  - Real-time log retrieval for individual tasks
  - Aggregate logs for entire DAG runs
  - Smart log tailing with recent activity summaries
  - Automatic log formatting and decoding
Available Tools
DAG Management
- airflow_list_dags - List all DAGs with pagination and sorting
- airflow_get_dag - Get detailed information about a specific DAG
- airflow_trigger_dag - Trigger a new DAG run with optional configuration
- airflow_pause_dag - Pause a DAG
- airflow_unpause_dag - Unpause a DAG
DAG Run Monitoring
- airflow_list_dag_runs - List DAG runs for a specific DAG
- airflow_get_dag_run - Get details of a specific DAG run
- airflow_list_task_instances - List task instances for a DAG run
- airflow_get_task_instance - Get detailed task instance information
Logging & Debugging
- airflow_get_task_logs - Get complete logs for a specific task instance
- airflow_get_dag_run_logs - Get logs for all tasks in a DAG run
- airflow_tail_dag_run - Tail/monitor a DAG run with recent activity and logs
Installation & Deployment
Local Development
Via NPX (Recommended for Claude Desktop)
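A minimal sketch of a stdio launch via NPX (the package name below is an assumption; substitute the name this server is actually published under):

```bash
# Package name is illustrative; replace with the actual published package
npx -y mcp-server-apache-airflow
```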
HTTP Server (Recommended for Cloud Deployment)
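For cloud use the server runs as an HTTP service instead of stdio. A sketch, assuming an `--http` flag and port option (both assumptions; check the server's CLI help for the actual switches):

```bash
# Flags and package name are assumptions; check the server's --help output
AIRFLOW_BASE_URL=https://your-airflow-instance.example.com \
AIRFLOW_TOKEN=your-api-token \
npx -y mcp-server-apache-airflow --http --port 3000
```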
From Source
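To build from source, a typical Node.js flow looks like the following (the repository URL is a placeholder):

```bash
# Repository URL is a placeholder; clone from the actual project repo
git clone https://github.com/your-org/mcp-server-apache-airflow.git
cd mcp-server-apache-airflow
npm install
npm run build
npm start
```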
Cloud Deployment (Recommended)
This server supports streamable HTTP transport, which is the current best practice for MCP servers. Deploy to your preferred cloud platform:
Quick Deploy
An interactive deploy script walks you through deploying to the following platforms (a sketch of the invocation follows the list):
- Google Cloud Platform (Cloud Run)
- Amazon Web Services (ECS Fargate)
- DigitalOcean App Platform
- Netlify (Serverless Functions)
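The invocation might look like this (the script path is an assumption, inferred from the repository's deploy/ directory referenced below):

```bash
# Script path is an assumption; check the deploy/ directory for the actual name
./deploy/deploy.sh
```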
Manual Deployment Options
DigitalOcean App Platform
- Fork this repository to your GitHub account
- Create a new app in DigitalOcean App Platform
- Connect your forked repository
- Use the provided app spec: `deploy/digitalocean-app.yaml`
- Set environment variables in the dashboard: `AIRFLOW_BASE_URL` and `AIRFLOW_TOKEN` (or `AIRFLOW_USERNAME` and `AIRFLOW_PASSWORD`)
Netlify
Netlify offers serverless deployment with built-in CI/CD and a global CDN.
Quick Deploy
Manual Deployment
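A sketch of a manual deployment using the standard Netlify CLI commands (the interactive prompts during `netlify init` will vary by account and site):

```bash
npm install -g netlify-cli   # install the Netlify CLI
netlify login                # authenticate with your Netlify account
netlify init                 # link or create a site for this repository
netlify deploy --prod        # build and deploy to production
```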
Environment Variables
Option 1: Using Netlify CLI (Recommended)
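Using the Netlify CLI to set the variables (all values are placeholders):

```bash
netlify env:set AIRFLOW_BASE_URL https://your-airflow-instance.example.com
netlify env:set AIRFLOW_TOKEN your-api-token
# Or, for basic auth:
netlify env:set AIRFLOW_USERNAME your-username
netlify env:set AIRFLOW_PASSWORD your-password
```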
Option 2: Netlify Dashboard
Set these in your Netlify site dashboard (Site settings → Environment variables):
- `AIRFLOW_BASE_URL`: Your Airflow instance URL
- `AIRFLOW_TOKEN`: Your Airflow API token (recommended)

Or for basic auth:
- `AIRFLOW_USERNAME`: Your Airflow username
- `AIRFLOW_PASSWORD`: Your Airflow password
Local Development
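Run the functions locally with the Netlify CLI:

```bash
netlify dev   # serves functions on http://localhost:8888 by default
```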
Your MCP server will be available at http://localhost:8888/.netlify/functions/mcp
Docker Deployment
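A sketch of a Docker-based deployment, assuming the repository ships a Dockerfile (the image name and port are placeholders):

```bash
# Image name and port are assumptions; adjust to your Dockerfile
docker build -t airflow-mcp-server .
docker run -d -p 3000:3000 \
  -e AIRFLOW_BASE_URL=https://your-airflow-instance.example.com \
  -e AIRFLOW_TOKEN=your-api-token \
  airflow-mcp-server
```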
Configuration
The server requires authentication configuration through environment variables:
Option 1: API Token (Recommended)
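For example (values are placeholders):

```bash
# API token authentication
export AIRFLOW_BASE_URL=https://your-airflow-instance.example.com
export AIRFLOW_TOKEN=your-api-token
```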
Option 2: Basic Authentication
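For example (values are placeholders):

```bash
# Basic authentication
export AIRFLOW_BASE_URL=https://your-airflow-instance.example.com
export AIRFLOW_USERNAME=your-username
export AIRFLOW_PASSWORD=your-password
```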
Environment Variables
| Variable | Required | Description |
|---|---|---|
| `AIRFLOW_BASE_URL` | Yes | Base URL of your Airflow instance |
| `AIRFLOW_TOKEN` | No* | API token for authentication |
| `AIRFLOW_USERNAME` | No* | Username for basic auth |
| `AIRFLOW_PASSWORD` | No* | Password for basic auth |

\*Either `AIRFLOW_TOKEN` or both `AIRFLOW_USERNAME` and `AIRFLOW_PASSWORD` must be provided.
Platform-Specific Setup
Astronomer
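A sketch for an Astronomer deployment (the URL shape and token source are assumptions; copy the actual Airflow URL and an API token from the Astronomer UI):

```bash
# URL shape is illustrative; use your deployment's actual Airflow URL
export AIRFLOW_BASE_URL=https://your-org.astronomer.run/your-deployment
export AIRFLOW_TOKEN=your-astronomer-api-token
```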
Google Cloud Composer
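For Google Cloud Composer, the Airflow web server URL can be read from the environment's configuration with a standard gcloud command (environment name and location are placeholders):

```bash
# Environment name and location are placeholders
export AIRFLOW_BASE_URL=$(gcloud composer environments describe your-environment \
  --location us-central1 --format "value(config.airflowUri)")
```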
Amazon MWAA
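For Amazon MWAA, the web server hostname is available from the environment description (environment name is a placeholder; note that MWAA's token-based authentication has additional steps, see the AWS documentation):

```bash
# Prefix the returned hostname with https:// when setting AIRFLOW_BASE_URL
aws mwaa get-environment --name your-environment \
  --query 'Environment.WebserverUrl' --output text
```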
Testing
Local Testing
Test both stdio and HTTP modes:
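One way to exercise the server interactively is the MCP Inspector (the entry point path below is an assumption; adjust to this repository's built output):

```bash
# Entry point is an assumption; point it at the server's actual executable
npx @modelcontextprotocol/inspector node dist/index.js
```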
HTTP API Testing
Once deployed, test your HTTP endpoint:
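A quick smoke test with curl against the streamable HTTP endpoint (the URL is a placeholder; `tools/list` is a standard MCP method):

```bash
curl -X POST https://your-deployed-url/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```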
Claude Desktop Integration
Stdio Mode (Local Development)
Add this to your Claude Desktop MCP settings:
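A sketch of the `claude_desktop_config.json` entry (the package name is an assumption; the values under `env` are placeholders):

```json
{
  "mcpServers": {
    "airflow": {
      "command": "npx",
      "args": ["-y", "mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_BASE_URL": "https://your-airflow-instance.example.com",
        "AIRFLOW_TOKEN": "your-api-token"
      }
    }
  }
}
```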
HTTP Mode (Cloud Deployment)
For streamable HTTP transport, configure Claude to use your deployed endpoint:
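Claude Desktop itself speaks stdio, so one common bridge to a remote streamable HTTP endpoint is the `mcp-remote` package (shown as a sketch; the endpoint URL is a placeholder):

```json
{
  "mcpServers": {
    "airflow": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://your-site.netlify.app/mcp"]
    }
  }
}
```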
Platform-specific endpoints:
- Netlify: `https://your-site.netlify.app/mcp`
- Google Cloud Run: `https://your-service-url.run.app/`
- AWS/DigitalOcean: `https://your-deployed-url/`
Usage Examples
Once connected, you can use natural language to interact with Airflow:
DAG Management
- "List all my DAGs"
- "Show me the details of the data_pipeline DAG"
- "Trigger the daily_etl DAG with custom configuration"
- "Pause the problematic_dag DAG"
Monitoring & Status
- "What's the status of the latest run for my_workflow?"
- "Show me all failed task instances from the last run"
- "List all DAG runs for my_data_pipeline from today"
Logging & Debugging
- "Show me the logs for the extract_data task in run daily_etl_2024_01_15"
- "Get all logs for the failed DAG run daily_etl_2024_01_15"
- "Tail the current DAG run and show me what's happening"
- "Show me the recent activity for the running data_pipeline"
Advanced Examples
- "Get logs for task 'transform_data' in DAG 'etl_pipeline' run 'manual_2024_01_15', try number 2"
- "Monitor the DAG run 'scheduled_2024_01_15' and show the last 100 log lines for each task"
- "Show me logs for the first 5 tasks in the failed DAG run"
Authentication Requirements
This server uses Airflow's stable REST API (v1), which requires authentication. The API supports:
- Bearer Token Authentication: Most secure, recommended for production
- Basic Authentication: Username/password, useful for development
- Session Authentication: Handled automatically when using web-based tokens
Security Considerations
- Store credentials securely and never commit them to version control
- Use environment variables or secure secret management systems
- For production deployments, prefer API tokens over username/password
- Ensure your Airflow instance has proper network security (TLS, VPC, etc.)
- Apply appropriate rate limiting and monitoring
- Use HTTPS endpoints for production deployments
- Implement proper authentication and authorization at the load balancer/gateway level
Performance & Scaling
HTTP Mode Benefits
- Stateless: Each request is independent, allowing horizontal scaling
- Caching: Responses can be cached at the CDN/proxy level
- Load Balancing: Multiple instances can handle requests
- Monitoring: Standard HTTP monitoring tools work out of the box
- Debugging: Easy to test and debug with standard HTTP tools
Recommended Production Setup
- Auto-scaling: Configure your cloud platform to scale based on CPU/memory usage
- Health Checks: Use the `/health` endpoint for load balancer health checks
- Monitoring: Set up logging and metrics collection
- Caching: Consider caching frequently accessed DAG information
- Rate Limiting: Implement rate limiting to protect your Airflow instance
API Compatibility
This server is compatible with Apache Airflow 2.x REST API. It has been tested with:
- Apache Airflow 2.7+
- Astronomer Software and Cloud
- Google Cloud Composer 2
- Amazon MWAA (all supported Airflow versions)
Development
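A typical contributor workflow for a Node.js project (script names are assumptions; check `package.json` for the actual scripts):

```bash
# Script names are assumptions; see package.json
npm install
npm run build
npm test
```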
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT License - see LICENSE file for details.