s3_object_upload
Upload a file to an Amazon S3 bucket. The caller supplies the bucket name, the object key, and the file content as a base64-encoded string.
Instructions
Upload an object to S3
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| bucket_name | Yes | Name of the S3 bucket | |
| object_key | Yes | Key/path of the object in the bucket | |
| file_content | Yes | Base64 encoded file content for upload | |
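
The file_content value must be the base64 encoding of the raw file bytes, passed as a string. A minimal sketch of preparing a conforming arguments payload (the local file name, bucket, and key below are hypothetical):

```python
import base64
import json

# Read raw bytes and base64-encode them into a string, as the schema requires.
with open("report.pdf", "rb") as f:  # hypothetical local file
    encoded = base64.b64encode(f.read()).decode("ascii")

arguments = {
    "bucket_name": "example-bucket",      # hypothetical bucket
    "object_key": "reports/report.pdf",   # hypothetical key/path
    "file_content": encoded,
}
print(json.dumps(arguments)[:80], "...")
```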
Implementation Reference
- src/mcp_server_aws/server.py:155-159 (handler): Executes the s3_object_upload tool by decoding the base64 file content and uploading it to the specified S3 bucket and key via the boto3 S3 client.

```python
elif name == "s3_object_upload":
    response = s3_client.upload_fileobj(
        io.BytesIO(base64.b64decode(arguments["file_content"])),
        arguments["bucket_name"],
        arguments["object_key"])
```
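
Note that upload_fileobj streams the decoded bytes to S3 and returns None on success (failures raise an exception), so the serialized "Operation Result" for this tool is null. A standalone sketch of the same decode-and-upload, assuming AWS credentials are configured and the bucket exists (names are hypothetical):

```python
import base64
import io

import boto3

s3_client = boto3.client("s3")

# What a caller would send as file_content:
file_content = base64.b64encode(b"hello, s3").decode("ascii")

# Decode back to raw bytes and stream them to S3; returns None on success.
s3_client.upload_fileobj(
    io.BytesIO(base64.b64decode(file_content)),
    "example-bucket",      # hypothetical bucket
    "uploads/hello.txt",   # hypothetical object key
)
```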
- src/mcp_server_aws/tools.py:42-63 (schema): Defines the input schema and metadata for the s3_object_upload tool.

```python
Tool(
    name="s3_object_upload",
    description="Upload an object to S3",
    inputSchema={
        "type": "object",
        "properties": {
            "bucket_name": {
                "type": "string",
                "description": "Name of the S3 bucket"
            },
            "object_key": {
                "type": "string",
                "description": "Key/path of the object in the bucket"
            },
            "file_content": {
                "type": "string",
                "description": "Base64 encoded file content for upload"
            }
        },
        "required": ["bucket_name", "object_key", "file_content"]
    }
),
```
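
The "required" list declares that all three fields must be present. As an illustration only (enforcement depends on the client and framework; the server does not use this package), the same check can be reproduced with the third-party jsonschema library:

```python
from jsonschema import ValidationError, validate

schema = {
    "type": "object",
    "properties": {
        "bucket_name": {"type": "string"},
        "object_key": {"type": "string"},
        "file_content": {"type": "string"},
    },
    "required": ["bucket_name", "object_key", "file_content"],
}

try:
    # Missing file_content, so validation fails.
    validate({"bucket_name": "example-bucket", "object_key": "a.txt"}, schema)
except ValidationError as exc:
    print(exc.message)  # 'file_content' is a required property
```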
- src/mcp_server_aws/server.py:136-140 (registration): Registers the s3_object_upload tool (among others) by including it in the list_tools response via get_aws_tools().

```python
async def list_tools() -> list[Tool]:
    """List available AWS tools"""
    logger.debug("Handling list_tools request")
    return get_aws_tools()
```
- src/mcp_server_aws/server.py:141-181 (handler): The dispatch function handle_s3_operations, which contains the specific handler logic for s3_object_upload and the other S3 tools, called from the main call_tool handler.

```python
async def handle_s3_operations(aws: AWSManager, name: str, arguments: dict) -> list[TextContent]:
    """Handle S3-specific operations"""
    s3_client = aws.get_boto3_client('s3')
    response = None

    if name == "s3_bucket_create":
        response = s3_client.create_bucket(
            Bucket=arguments["bucket_name"],
            CreateBucketConfiguration={
                'LocationConstraint': os.getenv("AWS_REGION") or 'us-east-1'
            })
    elif name == "s3_bucket_list":
        response = s3_client.list_buckets()
    elif name == "s3_bucket_delete":
        response = s3_client.delete_bucket(Bucket=arguments["bucket_name"])
    elif name == "s3_object_upload":
        response = s3_client.upload_fileobj(
            io.BytesIO(base64.b64decode(arguments["file_content"])),
            arguments["bucket_name"],
            arguments["object_key"])
    elif name == "s3_object_delete":
        response = s3_client.delete_object(
            Bucket=arguments["bucket_name"],
            Key=arguments["object_key"]
        )
    elif name == "s3_object_list":
        response = s3_client.list_objects_v2(
            Bucket=arguments["bucket_name"])
    elif name == "s3_object_read":
        logging.info(f"Reading object: {arguments['object_key']}")
        response = s3_client.get_object(
            Bucket=arguments["bucket_name"],
            Key=arguments["object_key"]
        )
        content = response['Body'].read().decode('utf-8')
        return [TextContent(type="text", text=content)]
    else:
        raise ValueError(f"Unknown S3 operation: {name}")

    aws.log_operation("s3", name.replace("s3_", ""), arguments)
    return [TextContent(type="text", text=f"Operation Result:\n{json.dumps(response, indent=2, default=custom_json_serializer)}")]
```
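
custom_json_serializer is referenced here but defined elsewhere in the repository. A plausible minimal sketch, assuming it only needs to cover the non-JSON-native types that commonly appear in boto3 responses (this is a guess at its behavior, not the actual implementation):

```python
import base64
from datetime import datetime

def custom_json_serializer(obj):
    """Fallback for json.dumps(default=...): handle types json cannot encode."""
    if isinstance(obj, datetime):
        return obj.isoformat()  # e.g. LastModified in list_objects_v2 responses
    if isinstance(obj, bytes):
        return base64.b64encode(obj).decode("ascii")
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")
```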
- src/mcp_server_aws/server.py:357-359 (dispatch): Routes s3_object_upload calls to the S3 handler from within the main @server.call_tool() handler.

```python
if name.startswith("s3_"):
    return await handle_s3_operations(aws, name, arguments)
elif name.startswith("dynamodb_"):
```
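
End to end, a client discovers the tool via list_tools and then calls it by name with arguments matching the schema. A sketch using the MCP Python SDK's stdio client; the launch command, bucket, and key are assumptions, not taken from this repository:

```python
import asyncio
import base64

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Hypothetical launch command; use however the server is actually started.
    params = StdioServerParameters(command="python", args=["-m", "mcp_server_aws"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("s3_object_upload", {
                "bucket_name": "example-bucket",     # hypothetical
                "object_key": "uploads/hello.txt",   # hypothetical
                "file_content": base64.b64encode(b"hello").decode("ascii"),
            })
            print(result.content)

asyncio.run(main())
```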