List Amazon MWAA (Managed Workflows for Apache Airflow) environments to monitor and manage your data workflows. Retrieve environment details using AWS profiles and region settings.
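A minimal sketch of what listing environments could look like, assuming boto3 is installed and AWS credentials are configured; the profile and region values are illustrative placeholders, not settings from this document.

```python
def session_kwargs(profile=None, region=None):
    """Build keyword arguments for a boto3 Session from optional
    profile and region settings (example values only)."""
    kwargs = {}
    if profile:
        kwargs["profile_name"] = profile
    if region:
        kwargs["region_name"] = region
    return kwargs


def list_mwaa_environments(profile=None, region=None):
    """Return MWAA environment names; needs real AWS credentials to run."""
    import boto3  # imported here so the sketch loads even without boto3

    session = boto3.Session(**session_kwargs(profile, region))
    client = session.client("mwaa")
    names = []
    # ListEnvironments is paginated via NextToken; the paginator handles it.
    for page in client.get_paginator("list_environments").paginate():
        names.extend(page["Environments"])
    return names
```

Per-environment details (status, Airflow version, webserver URL) would then come from `client.get_environment(Name=...)`.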
Submit data processing jobs to Google Cloud Dataproc clusters for Spark, PySpark, Hive, and Hadoop workloads using specified project, region, and cluster parameters.
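As a sketch, a PySpark submission to Dataproc might be built like this; the project, region, cluster, and file URI are hypothetical, and the job dict follows the field names of the Dataproc `Job` resource.

```python
def pyspark_job(cluster_name, main_uri, args=()):
    """Build the job payload for Dataproc job submission.
    All parameter values passed in are caller-supplied placeholders."""
    return {
        "placement": {"cluster_name": cluster_name},
        "pyspark_job": {
            "main_python_file_uri": main_uri,
            "args": list(args),
        },
    }


def submit(project_id, region, job):
    """Submit the job; requires google-cloud-dataproc and GCP credentials."""
    from google.cloud import dataproc_v1  # deferred import, not stdlib

    client = dataproc_v1.JobControllerClient(
        # Dataproc uses regional endpoints, e.g. us-central1-dataproc.googleapis.com
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )
    return client.submit_job(project_id=project_id, region=region, job=job)
```

Spark, Hive, or Hadoop workloads would swap the `pyspark_job` key for `spark_job`, `hive_job`, or `hadoop_job` with their corresponding fields.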
Provides best-practice guidance for designing, optimizing, and securing Apache Airflow workflows on Amazon MWAA, covering DAG patterns, performance, resource management, error handling, and security.
Retrieve detailed user information including profile, permissions, and authentication sources from Apache Ambari Hadoop clusters to manage access and monitor user activities.
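Ambari exposes user details over its REST API with HTTP Basic auth. A stdlib-only sketch that builds (but does not send) such a request; the base URL, usernames, and password are illustrative placeholders.

```python
import base64
import urllib.request


def ambari_user_request(base_url, username, admin_user, admin_pass):
    """Build a GET request for one Ambari user, asking for the user's
    profile fields and privileges. Sending it requires a live cluster."""
    url = f"{base_url}/api/v1/users/{username}?fields=Users/*,privileges/*"
    # Ambari's REST API authenticates with HTTP Basic auth.
    token = base64.b64encode(f"{admin_user}:{admin_pass}".encode()).decode()
    return urllib.request.Request(
        url, headers={"Authorization": f"Basic {token}"}
    )
```

Dispatching it with `urllib.request.urlopen(req)` would return a JSON document describing the user's profile, group memberships, and granted privileges.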
Create dynamic, interactive charts using Apache ECharts by providing customizable configurations. Export visualizations as PNG, SVG, or the raw option format for seamless integration into web applications.
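ECharts charts are driven by a JSON-serializable `option` object. A minimal sketch that assembles a bar-chart option as a plain dict, with the function name and sample data invented for illustration; the keys follow ECharts' documented option schema.

```python
import json


def bar_chart_option(categories, values, title="Example chart"):
    """Build an ECharts `option` dict for a simple bar chart.
    Serialize with json.dumps to hand it to a web page or exporter."""
    return {
        "title": {"text": title},
        "xAxis": {"type": "category", "data": list(categories)},
        "yAxis": {"type": "value"},
        "series": [{"type": "bar", "data": list(values)}],
    }


option_json = json.dumps(bar_chart_option(["Mon", "Tue", "Wed"], [5, 8, 3]))
```

Swapping the series `type` (e.g. `"line"`, `"pie"`) is typically all that changes between chart kinds; the same option object can back PNG, SVG, or raw-option exports.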