
Writing a pandas DataFrame with a nested structure to DynamoDB using Python and AWS Lambda

I am trying to write a pandas DataFrame to a DynamoDB table. This DataFrame has nested objects:

{
  "PK": {
    "S": "2"
  },
  "SK": {
    "S": "INFO"
  },
  "001": {
    "M": {
      "New_Some": {
        "N": "6"
      },
      "New_Some1": {
        "N": "2"
      },
      "New_Some2": {
        "N": "1"
      },
      "New_Some3": {
        "N": "1"
      }
    }
  },
  "status": {
    "S": "New"
  },
  "ModelVals": {
    "L": [
      {
        "M": {
          "Models": {
            "L": [
              {
                "S": "ABC123"
              }
            ]
          },
          "class": {
            "S": "XYZ222"
          }
        }
      }
    ]
  }
}

My pandas DataFrame columns contain lists and dicts, as below:

column1 - ["mfg_nom", "mfg_nom", "mfg_nom", "mfg_nom"]
column2 - ["ZZY", "ZZY", "XYZ", "XYZ"]
column3 - ["1", "2", "2", "1"]

and 

column4 - {"New_Some": "0.000"}
column5 - {"New_Some1": "636.000"}
column6 - {}
column7 - {"Insta": 7, "Other": 7}

The pandas DataFrame contains multiple nested lists and dictionaries. How can I build each row into an item to insert into DynamoDB? So far I have tried the below, but it only works with strings, not arrays:

    # table_name, s3_path, dataset and path_suffix are defined elsewhere
    import json
    from decimal import Decimal
    import awswrangler as wr
    import boto3

    table = boto3.resource("dynamodb").Table(table_name)

    df = wr.s3.read_parquet(path=s3_path,
                            dataset=dataset,
                            path_suffix=path_suffix).isna()

    with table.batch_writer() as batch:
        for index, row in df.iterrows():
            print(row.to_json())
            batch.put_item(json.loads(row.to_json(), parse_float=Decimal))

I am getting an error like:

"An error occurred (ValidationException) when calling the BatchWriteItem operation: The provided key element does not match the schema",
 

My suggestion would be to use AWS Glue instead of Lambda: Glue has a built-in DynamoDB connector that lets you read from S3 and write directly to DynamoDB.

https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-connect.html#aws-glue-programming-etl-connect-dynamodb
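
A rough sketch of what such a Glue job could look like is below (the bucket, prefix, and table name "my-table" are placeholders, not taken from your question); the DynamoDB sink should map nested struct and array columns to DynamoDB M and L types:

    # Minimal Glue job sketch with placeholder names
    from awsglue.context import GlueContext
    from pyspark.context import SparkContext

    glue_context = GlueContext(SparkContext.getOrCreate())

    # Read the parquet dataset from S3 as a DynamicFrame
    dyf = glue_context.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://my-bucket/my-prefix/"]},
        format="parquet",
    )

    # Write directly to DynamoDB; nested struct/array columns become M/L types
    glue_context.write_dynamic_frame_from_options(
        frame=dyf,
        connection_type="dynamodb",
        connection_options={
            "dynamodb.output.tableName": "my-table",
            "dynamodb.throughput.write.percent": "1.0",
        },
    )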

However, if you must use Lambda, you can make use of awswrangler for DynamoDB:

https://aws-sdk-pandas.readthedocs.io/en/stable/stubs/awswrangler.dynamodb.put_df.html
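
That way awswrangler serialises each row, including nested lists and dicts, into DynamoDB types for you. A minimal sketch, reusing the read_parquet arguments from your snippet and a hypothetical table named "my-table":

    import awswrangler as wr

    # Read the parquet dataset from S3 (same arguments as in the question)
    df = wr.s3.read_parquet(path=s3_path,
                            dataset=dataset,
                            path_suffix=path_suffix)

    # Write the whole DataFrame to DynamoDB; nested lists/dicts in the columns
    # are stored as DynamoDB List (L) and Map (M) types.
    # Depending on the awswrangler version you may still need to convert
    # float columns to Decimal before calling put_df.
    wr.dynamodb.put_df(df=df, table_name="my-table")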
