
Permission denied when trying to load into Postgres RDS from S3 with a path that contains the equals sign

I'm trying to load data into Postgres RDS from S3 using the aws_s3.table_import_from_s3 function, as explained here: Importing Data into PostgreSQL on Amazon RDS

The file path in S3 contains an equals sign ('='), and because of it I get the following error:

ERROR:  HTTP 403. Permission denied. Check bucket or provided credentials as they may no longer be valid.

When I replace the equals sign with another character, such as an underscore ('_'), the import succeeds.

Example:

SELECT aws_s3.table_import_from_s3(
     'my_table',
     'col1,col2',
     '(format csv)',
     'my_bucket',
     'date=20191031/my_data.csv',
     'eu-west-2'
);

throws the error mentioned above (HTTP 403), while:

SELECT aws_s3.table_import_from_s3(
     'my_table',
     'col1,col2',
     '(format csv)',
     'my_bucket',
     'date_20191031/my_data.csv',
     'eu-west-2'
);

succeeds without any problem.

Is there an escape character I should use for the equals sign? Or any other solution?

I didn't test it, but it should work: just percent-encode = as %3D.

SELECT aws_s3.table_import_from_s3(
 'my_table',
 'col1,col2',
 '(format csv)',
 'my_bucket',
 'date%3D20191031/my_data.csv',
 'eu-west-2');

I also faced this Permission denied. Check bucket or provided credentials as they may no longer be valid error when using the aws_s3.table_import_from_s3 function.

I first tried to URL-encode the whole path, but that also encodes the / slashes, which breaks the S3 object path separators. So instead, URL-encode everything except the slashes.

PHP example that URL-encodes each path segment:

$path = implode('/', array_map('urlencode', explode('/', $path)));
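The same per-segment encoding can be sketched in Python (a hypothetical equivalent for readers not using PHP). Conveniently, urllib.parse.quote already leaves '/' unescaped by default, so no explicit split/join is needed:

```python
from urllib.parse import quote

def encode_s3_key(key: str) -> str:
    """Percent-encode an S3 object key, leaving the '/' separators intact.

    quote() treats '/' as safe by default (safe='/'), so characters such as
    '=' and ':' become %3D and %3A while the path structure is preserved.
    """
    return quote(key)

# encode_s3_key('date=20191031/my_data.csv') -> 'date%3D20191031/my_data.csv'
```

The resulting string can then be passed as the key argument to aws_s3.table_import_from_s3 or aws_commons.create_s3_uri.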

Before - not working

SELECT aws_s3.table_import_from_s3(
   'TABLE_NAME',
   '',
   '(FORMAT csv, HEADER true)',
   aws_commons.create_s3_uri('BUCKET', 'subdirectory/2020-09-10T09:48:40.831Z-6402c3e6-86e6-4a15-8bc3-51d9eb9f08a3.csv', 'aws-region'),
   aws_commons.create_aws_credentials('***', '***', '')
)

After - working

SELECT aws_s3.table_import_from_s3(
   'TABLE_NAME',
   '',
   '(FORMAT csv, HEADER true)',
   aws_commons.create_s3_uri('BUCKET', 'subdirectory/2020-09-10T09%3A48%3A40.831Z-6402c3e6-86e6-4a15-8bc3-51d9eb9f08a3.csv', 'aws-region'),
   aws_commons.create_aws_credentials('***', '***', '')
)

Ideally, this encoding would be handled by the aws_commons.create_s3_uri function provided with the aws_s3 extension.
