
Unloading Snowflake data to S3 Location directly with Canned ACL

I am trying to unload the results of a particular query in Snowflake to an S3 location directly.

copy into 's3://bucket-name/folder/text.csv'
from (<some SQL query>)
file_format = (type = CSV file_extension = '.csv' field_optionally_enclosed_by = NONE empty_field_as_null = false)
max_file_size = 5000000000
storage_integration = aws
single = true;

The problem is that, after the write succeeds, the bucket owner cannot read the new file from S3 because of its ACL. How do you apply the canned ACL "bucket-owner-full-control" when writing to S3 from Snowflake? I am also not very familiar with Google Cloud Storage; what would the equivalent be for GCS buckets?

You might not be able to add a canned ACL to your COPY INTO statement; however, what you can do is add the required parameter to the Storage Integration.

When you create your Storage Integration, or when you update an existing one, add this parameter to the statement: STORAGE_AWS_OBJECT_ACL = 'bucket-owner-full-control'
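For example, here is a minimal sketch of both forms, assuming the integration is named aws to match your COPY INTO statement; the role ARN and allowed locations are placeholders you would replace with your own values:

-- Option 1: set the ACL when creating a new storage integration
CREATE STORAGE INTEGRATION aws
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::001234567890:role/my-snowflake-role'  -- placeholder
  STORAGE_AWS_OBJECT_ACL = 'bucket-owner-full-control'
  STORAGE_ALLOWED_LOCATIONS = ('s3://bucket-name/folder/');  -- placeholder

-- Option 2: add the ACL to an existing storage integration
ALTER STORAGE INTEGRATION aws
  SET STORAGE_AWS_OBJECT_ACL = 'bucket-owner-full-control';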

This should ensure that whatever data you unload to the bucket from Snowflake gives the bucket owner full control over the object.
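If you want to confirm the setting took effect, you can describe the integration (again assuming it is named aws) and check the property list:

DESC STORAGE INTEGRATION aws;
-- look for STORAGE_AWS_OBJECT_ACL = bucket-owner-full-control in the output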

https://docs.snowflake.com/en/user-guide/data-unload-s3.html#configuring-support-for-amazon-s3-access-control-lists-optional
