
MCP Server for Apache Airflow

by yangkyeongmo

post_dag_run

Triggers a specific DAG in Apache Airflow by its ID, enabling targeted workflow execution with optional parameters such as a custom run ID, data interval bounds, and execution/logical dates.

Instructions

Trigger a DAG by ID

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| dag_id | Yes | ID of the DAG to trigger. | |
| dag_run_id | No | Run ID; generated by Airflow if omitted. | |
| data_interval_end | No | End of the data interval for the run. | |
| data_interval_start | No | Start of the data interval for the run. | |
| execution_date | No | Deprecated alias for logical_date. | |
| logical_date | No | Logical date of the DAG run. | |
| note | No | Free-text note attached to the run. | |
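Based on the schema above, an MCP client might pass arguments like the following when calling this tool (the DAG ID, run ID, dates, and note below are illustrative values, not from the source):

```python
# Hypothetical arguments payload for the post_dag_run tool.
# Only dag_id is required; every other key may be omitted.
arguments = {
    "dag_id": "example_dag",
    "dag_run_id": "manual__2024-01-01T00:00:00",
    "logical_date": "2024-01-01T00:00:00+00:00",
    "note": "Triggered via MCP",
}
```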

Implementation Reference

  • The main handler function for the 'post_dag_run' tool. It constructs a DAGRun object from the provided parameters (filtering None values), calls the Airflow DAGRunApi to post the DAG run, and returns the response as text content.
```python
async def post_dag_run(
    dag_id: str,
    dag_run_id: Optional[str] = None,
    data_interval_end: Optional[datetime] = None,
    data_interval_start: Optional[datetime] = None,
    execution_date: Optional[datetime] = None,
    logical_date: Optional[datetime] = None,
    note: Optional[str] = None,
    # state: Optional[str] = None,  # TODO: add state
) -> List[Union[types.TextContent, types.ImageContent, types.EmbeddedResource]]:
    # Build kwargs dictionary with only non-None values
    kwargs = {}

    # Add non-read-only fields that can be set during creation
    if dag_run_id is not None:
        kwargs["dag_run_id"] = dag_run_id
    if data_interval_end is not None:
        kwargs["data_interval_end"] = data_interval_end
    if data_interval_start is not None:
        kwargs["data_interval_start"] = data_interval_start
    if execution_date is not None:
        kwargs["execution_date"] = execution_date
    if logical_date is not None:
        kwargs["logical_date"] = logical_date
    if note is not None:
        kwargs["note"] = note

    # Create DAGRun without read-only fields
    dag_run = DAGRun(**kwargs)
    response = dag_run_api.post_dag_run(dag_id=dag_id, dag_run=dag_run)
    return [types.TextContent(type="text", text=str(response.to_dict()))]
```
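The None-filtering step in the handler can be isolated into a small helper for illustration (build_kwargs is a hypothetical name, not part of the source; this is a minimal sketch of the same pattern):

```python
def build_kwargs(**params) -> dict:
    # Same pattern as the handler: keep only parameters the caller
    # actually provided, so unset fields never reach the DAGRun
    # constructor and read-only defaults are left untouched.
    return {k: v for k, v in params.items() if v is not None}

kwargs = build_kwargs(dag_run_id="manual__run", note=None)
# kwargs contains only "dag_run_id"; the None-valued "note" is dropped.
```

This avoids sending explicit nulls to the Airflow API, which could otherwise overwrite server-side defaults.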
  • Module-level registration function that returns the list of all DAG-run-related tools, including ('post_dag_run', 'post_dag_run', 'Trigger a DAG by ID', False). It is imported and called in src/main.py to register the tools with the MCP server.
```python
def get_all_functions() -> list[tuple[Callable, str, str, bool]]:
    """Return list of (function, name, description, is_read_only) tuples for registration."""
    return [
        (post_dag_run, "post_dag_run", "Trigger a DAG by ID", False),
        (get_dag_runs, "get_dag_runs", "Get DAG runs by ID", True),
        (get_dag_runs_batch, "get_dag_runs_batch", "List DAG runs (batch)", True),
        (get_dag_run, "get_dag_run", "Get a DAG run by DAG ID and DAG run ID", True),
        (update_dag_run_state, "update_dag_run_state", "Update a DAG run state by DAG ID and DAG run ID", False),
        (delete_dag_run, "delete_dag_run", "Delete a DAG run by DAG ID and DAG run ID", False),
        (clear_dag_run, "clear_dag_run", "Clear a DAG run", False),
        (set_dag_run_note, "set_dag_run_note", "Update the DagRun note", False),
        (get_upstream_dataset_events, "get_upstream_dataset_events", "Get dataset events for a DAG run", True),
    ]
```
  • src/main.py:83-97 (registration)
    Central registration logic in the main entrypoint. Imports get_all_functions from src.airflow.dagrun (line 9, not shown), maps it via APITYPE_TO_FUNCTIONS, calls it to get the tool list, and registers each tool with app.add_tool using the function, name, and description.
```python
for api in apis:
    logging.debug(f"Adding API: {api}")
    get_function = APITYPE_TO_FUNCTIONS[APIType(api)]
    try:
        functions = get_function()
    except NotImplementedError:
        continue

    # Filter functions for read-only mode if requested
    if read_only:
        functions = filter_functions_for_read_only(functions)

    for func, name, description, *_ in functions:
        app.add_tool(Tool.from_function(func, name=name, description=description))
```

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/yangkyeongmo/mcp-server-apache-airflow'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.