
Problem Using AWS Commons/S3 Libraries to Import Data from S3 into a PostgreSQL 11.1 RDS Instance

I have been experimenting, reading posts, and studying the documentation for a long time with no great effect, and I still have a problem importing data from S3 into PostgreSQL. We upgraded and moved our database to RDS specifically to be able to use this feature, but I am fairly new to both PostgreSQL and AWS and am struggling with it. I am running:

  • PostgreSQL 11.1 on Amazon RDS
  • DBeaver 6.1.5 as a client.

Given the simple script below, run from a DBeaver SQL editor, I cannot get the import to work despite trying many things. This is also my very first post on Stack Overflow.


do $$
declare s3_uri varchar(128);
begin
  SELECT aws_commons.create_s3_uri(
  'aduro-data-dev',
   'semarchy/inbound/reference_data/AffiliateType.csv',
   'us-west-2'
 ) as s3_uri --\gset;

raise notice 's3_uri=[%]', s3_uri;

SELECT aws_s3.table_import_from_s3(
   'aduro_extensions.sa_affiliate_type_import ', '', '(format csv)',
   's3_uri', 
   aws_commons.create_aws_credentials('removed ', 'removed', '')
);

end $$

When I run the above, I get the following error:

SQL Error [42601]: ERROR: query has no destination for result data
  Hint: If you want to discard the results of a SELECT, use PERFORM instead.
  Where: PL/pgSQL function inline_code_block line 5 at SQL statement
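The hint points at a PL/pgSQL rule: inside a DO block or function, a bare SELECT must deliver its result somewhere (e.g. INTO a variable), or be written as PERFORM to run the query and discard the result. A minimal illustration:

do $$
begin
  -- select 1;   -- would fail: query has no destination for result data
  perform 1;     -- runs the query and discards the result
end $$;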

What I have tried:

  • The "\gset" command is from the AWS examples for this feature, but it causes a syntax error if I leave it in, so it is commented out; I believe it is meant for a command-line client such as psql rather than a GUI client (see the psql sketch below). It is supposed to store the composite value returned by the SELECT in the s3_uri variable.

  • If I change the line ") AS s3_uri --\gset;" to ") INTO s3_uri --\gset;", the error changes to:

SQL Error [22P02]: ERROR: malformed record literal: "s3_uri"
  Detail: Missing left parenthesis.
  Where: PL/pgSQL function inline_code_block line 13 at SQL statement

The output of the RAISE statement for the s3_uri value shows:

00000: s3_uri=[("(aduro-data-dev,semarchy/inbound/reference_data/AffiliateType.csv,us-west-2)")]
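For reference, \gset is a psql meta-command rather than SQL, which is why DBeaver rejects it. In a psql session, the AWS example would look roughly like this (a sketch of the intent, assuming the same bucket and key):

SELECT aws_commons.create_s3_uri(
   'aduro-data-dev',
   'semarchy/inbound/reference_data/AffiliateType.csv',
   'us-west-2'
) AS s3_uri \gset

-- \gset stores each output column in a psql variable of the same name,
-- so the composite value can then be passed on as :'s3_uri'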

It's an old question, but for anyone else trying this: put the s3_uri creation directly inside the table import statement:

SELECT aws_s3.table_import_from_s3(
   'aduro_extensions.sa_affiliate_type_import', '', '(format csv)',
   (SELECT aws_commons.create_s3_uri(
      'aduro-data-dev',
      'semarchy/inbound/reference_data/AffiliateType.csv',
      'us-west-2')
   ),
   aws_commons.create_aws_credentials('removed', 'removed', '')
);
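If you want the original DO block itself to work, both errors in the question point at the fix: declare the variable with the composite type that create_s3_uri actually returns (aws_commons._s3_uri_1, per the AWS docs) instead of varchar, SELECT ... INTO it, and call the import with PERFORM so the result can be discarded. The 22P02 error arose because the string literal 's3_uri' was being passed to table_import_from_s3 instead of the variable's composite value. A sketch along those lines, with the credentials again redacted:

do $$
declare
  s3_uri aws_commons._s3_uri_1;   -- composite type returned by create_s3_uri
begin
  select aws_commons.create_s3_uri(
    'aduro-data-dev',
    'semarchy/inbound/reference_data/AffiliateType.csv',
    'us-west-2'
  ) into s3_uri;

  raise notice 's3_uri=[%]', s3_uri;

  -- PERFORM discards the function's result row, avoiding the 42601 error;
  -- the variable s3_uri is passed, not the string 's3_uri'
  perform aws_s3.table_import_from_s3(
    'aduro_extensions.sa_affiliate_type_import', '', '(format csv)',
    s3_uri,
    aws_commons.create_aws_credentials('removed', 'removed', '')
  );
end $$;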

To implement the same import in an AWS Lambda function with Python and psycopg2:

import json
import boto3
import os
import psycopg2

def getCredentials():
  # Pull the database username/password from Secrets Manager; the host and
  # database name come from the Lambda environment variables.
  credential = {}
  secret_name = os.environ['SECRET_NAME']
  region_name = os.environ['REGION']
  client = boto3.client(
    service_name='secretsmanager',
    region_name=region_name
  )

  get_secret_value_response = client.get_secret_value(
    SecretId=secret_name
  )

  secret = json.loads(get_secret_value_response['SecretString'])

  credential['username'] = secret['username']
  credential['password'] = secret['password']
  credential['host'] = os.environ['DB_HOST']
  credential['db'] = os.environ['DB_DB']

  return credential

def handler(event, context):
  credential = getCredentials()
  connection = psycopg2.connect(user=credential['username'], password=credential['password'],
                                host=credential['host'], database=credential['db'])
  cursor = connection.cursor()
  # Build the s3_uri inline, as in the SQL answer above. No credentials argument is
  # passed, so this relies on the RDS instance having an IAM role that can read the bucket.
  cursor.execute("SELECT aws_s3.table_import_from_s3('" + os.environ['DB_TABLE'] + "', '', "
                 "'(format csv)', (SELECT aws_commons.create_s3_uri('" + os.environ['BUCKET'] + "', '" +
                 os.environ['IMPORT_FILE'] + "', '" + os.environ['REGION'] + "')));")
  results = cursor.fetchall()
  connection.commit()
  cursor.close()

  return results
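Both the SQL and the Lambda versions assume the aws_s3 extension is already installed in the target database; CASCADE pulls in its aws_commons dependency as well. And since the Lambda call passes no credentials, the RDS instance additionally needs an attached IAM role that permits reading the bucket. A one-time setup sketch:

-- Run once per database (requires sufficient privileges, e.g. rds_superuser)
CREATE EXTENSION IF NOT EXISTS aws_s3 CASCADE;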
