# update_environment
Modify configuration parameters of an existing Amazon Managed Workflows for Apache Airflow (MWAA) environment, such as the DAG path, scaling settings, and the Apache Airflow version.
## Instructions
Update an existing MWAA environment configuration.
Only provide the parameters you want to change.
Args:
- `name`: Environment name
- `dag_s3_path`: S3 path to DAGs folder
- `execution_role_arn`: IAM role ARN
- `network_configuration`: VPC configuration
- `source_bucket_arn`: S3 bucket ARN
- `airflow_version`: Apache Airflow version
- `environment_class`: Environment size
- `max_workers`: Maximum workers
- `min_workers`: Minimum workers
- `schedulers`: Number of schedulers
- `webserver_access_mode`: Access mode
- `weekly_maintenance_window_start`: Maintenance window
- `airflow_configuration_options`: Configuration overrides
- `logging_configuration`: Logging settings
- `requirements_s3_path`: Path to requirements.txt
- `plugins_s3_path`: Path to plugins.zip
- `startup_script_s3_path`: Path to startup script
Returns: A dictionary containing the environment ARN under the key `Arn`
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | Environment name | |
| dag_s3_path | No | S3 path to DAGs folder | None |
| execution_role_arn | No | IAM role ARN | None |
| network_configuration | No | VPC configuration | None |
| source_bucket_arn | No | S3 bucket ARN | None |
| airflow_version | No | Apache Airflow version | None |
| environment_class | No | Environment size | None |
| max_workers | No | Maximum workers | None |
| min_workers | No | Minimum workers | None |
| schedulers | No | Number of schedulers | None |
| webserver_access_mode | No | Access mode | None |
| weekly_maintenance_window_start | No | Maintenance window | None |
| airflow_configuration_options | No | Configuration overrides | None |
| logging_configuration | No | Logging settings | None |
| requirements_s3_path | No | Path to requirements.txt | None |
| plugins_s3_path | No | Path to plugins.zip | None |
| startup_script_s3_path | No | Path to startup script | None |
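Since only the parameters being changed should be supplied, a call typically consists of `name` plus the handful of fields to update. A minimal sketch of building such a partial-update payload (the `make_update_payload` helper is hypothetical, written here only to illustrate the schema above):

```python
from typing import Any, Dict


def make_update_payload(name: str, **changes: Any) -> Dict[str, Any]:
    """Build a tool-call payload containing only the fields to change."""
    payload: Dict[str, Any] = {"name": name}
    # Drop anything left as None so unchanged settings stay untouched.
    payload.update({k: v for k, v in changes.items() if v is not None})
    return payload


# Scale the environment without touching any other setting:
print(make_update_payload("my-airflow-env", max_workers=10, min_workers=2))
# → {'name': 'my-airflow-env', 'max_workers': 10, 'min_workers': 2}
```

Omitted fields are simply absent from the payload, so the tool leaves the corresponding environment settings unmodified.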
## Implementation Reference
- `awslabs/mwaa_mcp_server/tools.py:173-206` (handler) — the implementation of the `update_environment` tool, which filters out unset parameters, maps them to their boto3 names, and calls the MWAA client to update the environment.

```python
async def update_environment(self, **kwargs: Any) -> Dict[str, Any]:
    """Update an existing MWAA environment."""
    self._check_readonly("update_environment")
    try:
        params = {k: v for k, v in kwargs.items() if v is not None}
        boto_params: Dict[str, Any] = {}
        param_mapping = {
            "name": "Name",
            "dag_s3_path": "DagS3Path",
            "execution_role_arn": "ExecutionRoleArn",
            "network_configuration": "NetworkConfiguration",
            "source_bucket_arn": "SourceBucketArn",
            "airflow_version": "AirflowVersion",
            "environment_class": "EnvironmentClass",
            "max_workers": "MaxWorkers",
            "min_workers": "MinWorkers",
            "schedulers": "Schedulers",
            "webserver_access_mode": "WebserverAccessMode",
            "weekly_maintenance_window_start": "WeeklyMaintenanceWindowStart",
            "airflow_configuration_options": "AirflowConfigurationOptions",
            "logging_configuration": "LoggingConfiguration",
            "requirements_s3_path": "RequirementsS3Path",
            "plugins_s3_path": "PluginsS3Path",
            "startup_script_s3_path": "StartupScriptS3Path",
        }
        for snake_key, value in params.items():
            if snake_key in param_mapping:
                boto_params[param_mapping[snake_key]] = value
        response = self.mwaa_client.update_environment(**boto_params)
        return {"Arn": response["Arn"]}
    except Exception:
        # Error handling elided in this excerpt (see tools.py:173-206).
        raise
```

- `awslabs/mwaa_mcp_server/server.py:131-199` (registration) — the registration of the `update_environment` tool in the MCP server, which coerces the numeric arguments and delegates to the `tools.update_environment` handler.
```python
@mcp.tool(name="update_environment")
async def update_environment(
    name: str,
    dag_s3_path: Optional[str] = None,
    execution_role_arn: Optional[str] = None,
    network_configuration: Optional[Dict[str, Any]] = None,
    source_bucket_arn: Optional[str] = None,
    airflow_version: Optional[str] = None,
    environment_class: Optional[str] = None,
    max_workers: Optional[int] = None,
    min_workers: Optional[int] = None,
    schedulers: Optional[int] = None,
    webserver_access_mode: Optional[str] = None,
    weekly_maintenance_window_start: Optional[str] = None,
    airflow_configuration_options: Optional[Dict[str, str]] = None,
    logging_configuration: Optional[Dict[str, Any]] = None,
    requirements_s3_path: Optional[str] = None,
    plugins_s3_path: Optional[str] = None,
    startup_script_s3_path: Optional[str] = None,
) -> Dict[str, Any]:
    """Update an existing MWAA environment configuration.

    Only provide the parameters you want to change.

    Args:
        name: Environment name
        dag_s3_path: S3 path to DAGs folder
        execution_role_arn: IAM role ARN
        network_configuration: VPC configuration
        source_bucket_arn: S3 bucket ARN
        airflow_version: Apache Airflow version
        environment_class: Environment size
        max_workers: Maximum workers
        min_workers: Minimum workers
        schedulers: Number of schedulers
        webserver_access_mode: Access mode
        weekly_maintenance_window_start: Maintenance window
        airflow_configuration_options: Configuration overrides
        logging_configuration: Logging settings
        requirements_s3_path: Path to requirements.txt
        plugins_s3_path: Path to plugins.zip
        startup_script_s3_path: Path to startup script

    Returns:
        Dictionary containing the environment ARN
    """
    max_workers_int = int(max_workers) if max_workers is not None else None
    min_workers_int = int(min_workers) if min_workers is not None else None
    schedulers_int = int(schedulers) if schedulers is not None else None
    return await tools.update_environment(
        name=name,
        dag_s3_path=dag_s3_path,
        execution_role_arn=execution_role_arn,
        network_configuration=network_configuration,
        source_bucket_arn=source_bucket_arn,
        airflow_version=airflow_version,
        environment_class=environment_class,
        max_workers=max_workers_int,
        min_workers=min_workers_int,
        schedulers=schedulers_int,
        webserver_access_mode=webserver_access_mode,
        weekly_maintenance_window_start=weekly_maintenance_window_start,
        airflow_configuration_options=airflow_configuration_options,
        logging_configuration=logging_configuration,
        requirements_s3_path=requirements_s3_path,
        plugins_s3_path=plugins_s3_path,
        startup_script_s3_path=startup_script_s3_path,
    )
```
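The core translation step the handler performs — dropping unset (`None`) values and renaming snake_case tool arguments to the PascalCase names boto3 expects — can be sketched standalone. This is a simplified illustration, with the mapping abbreviated to a few representative keys from the full table in `tools.py`:

```python
from typing import Any, Dict, Optional

# Abbreviated version of the handler's param_mapping (see tools.py excerpt above).
PARAM_MAPPING = {
    "name": "Name",
    "dag_s3_path": "DagS3Path",
    "max_workers": "MaxWorkers",
    "min_workers": "MinWorkers",
    "schedulers": "Schedulers",
}


def to_boto_params(kwargs: Dict[str, Optional[Any]]) -> Dict[str, Any]:
    """Return only explicitly-set parameters, renamed for the boto3 call."""
    return {
        PARAM_MAPPING[key]: value
        for key, value in kwargs.items()
        if value is not None and key in PARAM_MAPPING
    }


print(to_boto_params({"name": "my-env", "max_workers": 10, "min_workers": None}))
# → {'Name': 'my-env', 'MaxWorkers': 10}
```

Filtering before mapping matters: a parameter the caller never set must not reach `mwaa_client.update_environment`, or it would overwrite the environment's existing value.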