running_dags
Monitor and retrieve detailed information on all currently running DAG runs in an Apache Airflow cluster, providing essential insight for workflow management and optimization.
Instructions
[Tool Role]: Lists all currently running DAG runs in the Airflow cluster.
Returns: List of running DAG runs with comprehensive info: dag_id, run_id, state, execution_date, start_date, end_date, data_interval_start, data_interval_end, run_type, conf, external_trigger, dag_display_name
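To make the return shape concrete, here is an illustrative sketch of a result as a Python dict. The values are invented for demonstration only; the field names follow the handler shown under Implementation Reference below.

```python
# Illustrative example of the tool's return value (values are made up).
example_result = {
    "dag_runs": [
        {
            "dag_id": "etl_daily",
            "dag_display_name": "ETL Daily",
            "run_id": "scheduled__2024-01-01T00:00:00+00:00",
            "run_type": "scheduled",
            "state": "running",
            "execution_date": "2024-01-01T00:00:00+00:00",
            "start_date": "2024-01-01T00:05:12+00:00",
            "end_date": None,
            "data_interval_start": "2023-12-31T00:00:00+00:00",
            "data_interval_end": "2024-01-01T00:00:00+00:00",
            "external_trigger": False,
            "conf": {},
            "note": None,
        }
    ],
    "total_running": 1,
    "query_info": {
        "state_filter": "running",
        "limit": 1000,
        "order_by": "start_date (descending)",
    },
}
```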
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| No arguments | | | |
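Because the tool takes no parameters, an MCP client invokes it with an empty arguments object. The sketch below shows the generic JSON-RPC tools/call request as a Python dict; the id value is illustrative.

```python
# Hypothetical illustration of the MCP tools/call payload for this tool.
# "arguments" is empty because running_dags declares no input parameters.
call_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "running_dags",
        "arguments": {},
    },
}
```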
Implementation Reference
- Core implementation of the 'running_dags' MCP tool handler. Queries the Airflow REST API (v1/v2) for running DAG runs (state=running), processes the response, and returns structured data with run details and summary statistics. A hedged sketch of the airflow_request helper it awaits follows this list.

  ```python
  @mcp.tool()
  async def running_dags() -> Dict[str, Any]:
      """[Tool Role]: Lists all currently running DAG runs in the Airflow cluster."""
      resp = await airflow_request("GET", "/dags/~/dagRuns?state=running&limit=1000&order_by=-start_date")
      resp.raise_for_status()
      data = resp.json()

      running_runs = []
      for run in data.get("dag_runs", []):
          run_info = {
              "dag_id": run.get("dag_id"),
              "dag_display_name": run.get("dag_display_name"),
              "run_id": run.get("run_id"),
              "run_type": run.get("run_type"),
              "state": run.get("state"),
              "execution_date": run.get("execution_date"),
              "start_date": run.get("start_date"),
              "end_date": run.get("end_date"),
              "data_interval_start": run.get("data_interval_start"),
              "data_interval_end": run.get("data_interval_end"),
              "external_trigger": run.get("external_trigger"),
              "conf": run.get("conf"),
              "note": run.get("note"),
          }
          running_runs.append(run_info)

      return {
          "dag_runs": running_runs,
          "total_running": len(running_runs),
          "query_info": {
              "state_filter": "running",
              "limit": 1000,
              "order_by": "start_date (descending)",
          },
      }
  ```
- src/mcp_airflow_api/tools/v1_tools.py:13-27 (registration): Registration entry point for Airflow API v1. Sets the v1-specific airflow_request function and calls register_common_tools(mcp), which registers the running_dags tool via the @mcp.tool() decorator.

  ```python
  def register_tools(mcp):
      """Register v1 tools by importing common tools with v1 request function."""
      logger.info("Initializing MCP server for Airflow API v1")
      logger.info("Loading Airflow API v1 tools (Airflow 2.x)")

      # Set the global request function to v1
      common_tools.airflow_request = airflow_request_v1

      # Register all 56 common tools (includes management tools)
      common_tools.register_common_tools(mcp)

      # V1 has no exclusive tools - all tools are shared with v2
      logger.info("Registered all Airflow API v1 tools (56 tools: 43 core + 13 management tools)")
  ```
- src/mcp_airflow_api/tools/v2_tools.py:21-24 (registration): Within the v2 register_tools function, sets the v2-specific airflow_request and calls register_common_tools(mcp), registering the running_dags tool for API v2.

  ```python
  common_tools.airflow_request = airflow_request_v2

  # Register all 43 common tools
  common_tools.register_common_tools(mcp)
  ```
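The handler above awaits a module-level airflow_request coroutine that is swapped in per API version at registration time. The project's real helper is not reproduced in this section; the following is a minimal sketch assuming an httpx-based client with basic auth, and the AIRFLOW_API_URL, AIRFLOW_USERNAME, and AIRFLOW_PASSWORD environment variable names are assumptions for illustration.

```python
# Minimal sketch (not the project's actual code) of a version-specific request helper.
# It returns the raw httpx.Response so callers can use raise_for_status() and json().
import os
import httpx

AIRFLOW_API_URL = os.environ.get("AIRFLOW_API_URL", "http://localhost:8080/api/v1")  # assumed env var

async def airflow_request_v1(method: str, path: str) -> httpx.Response:
    """Send an authenticated request to the Airflow REST API and return the response."""
    auth = (
        os.environ.get("AIRFLOW_USERNAME", ""),  # assumed env var
        os.environ.get("AIRFLOW_PASSWORD", ""),  # assumed env var
    )
    async with httpx.AsyncClient(base_url=AIRFLOW_API_URL, auth=auth, timeout=30.0) as client:
        # httpx joins the relative path onto base_url, e.g. /api/v1/dags/~/dagRuns?...
        return await client.request(method, path)
```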
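For context, a hypothetical startup sequence could choose between the two register_tools entry points like this. The FastMCP import, the AIRFLOW_API_VERSION switch, and the package import path are assumptions made for illustration, not the package's documented entry point.

```python
# Hypothetical wiring example: select the API-version-specific registration at startup.
import os
from mcp.server.fastmcp import FastMCP  # assumes the official MCP Python SDK server

from mcp_airflow_api.tools import v1_tools, v2_tools  # module paths as referenced above

mcp = FastMCP("airflow-api")

# Assumed switch: pick the tool set matching the target Airflow REST API version.
if os.environ.get("AIRFLOW_API_VERSION", "v1") == "v2":
    v2_tools.register_tools(mcp)
else:
    v1_tools.register_tools(mcp)

if __name__ == "__main__":
    mcp.run()
```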