
wr.redshift.to_sql failed in AWS Data Wrangler 2.12.1

awswrangler 2.12.1

I am able to write data.head() into the db, but got an error when trying to write all the data. The data is copied from another table and had some cleaning done before to_sql. I also did data = data.fillna(value=np.nan).

wr.redshift.to_sql(data, con, schema="level0",
                   table="test", mode="overwrite")

ProgrammingError: {'S': 'ERROR', 'C': '22001', 'M': 'value too long for type character varying(256)', 'F': '/home/ec2-user/padb/src/pg/src/backend/utils/adt/varchar.c', 'L': '511', 'R': 'varchar'}

This is a Redshift error and not linked to AWS Data Wrangler. The exception raised says that one of the values you are attempting to write exceeds the maximum length allowed by the column type definition (varchar(256)). More in the docs.
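Not part of the original answer, but a minimal sketch of one way to act on it. It uses a toy DataFrame in place of the question's data and assumes an existing Redshift connection con; it first locates string columns whose longest value exceeds 256 characters, then (assuming the awswrangler 2.x varchar_lengths / varchar_lengths_default arguments of wr.redshift.to_sql are available) requests wider varchar columns when the table is recreated with mode="overwrite". Trimming or truncating the long values before to_sql would be an alternative that keeps the default column widths.

import awswrangler as wr
import pandas as pd

# Toy DataFrame standing in for the question's `data`; one value is longer than 256 chars.
data = pd.DataFrame({
    "id": [1, 2],
    "description": ["short text", "x" * 300],
})

# Locate string columns whose longest value exceeds Redshift's default varchar(256).
long_cols = {}
for col in data.select_dtypes(include="object").columns:
    max_len = data[col].dropna().astype(str).str.len().max()
    if pd.notna(max_len) and max_len > 256:
        long_cols[col] = int(max_len)
print(long_cols)  # {'description': 300}

# `con` is assumed to be an existing Redshift connection, e.g. from wr.redshift.connect(...).
# With mode="overwrite" the table is recreated, so wider columns can be requested via
# varchar_lengths (per column) or varchar_lengths_default (for all text columns).
# wr.redshift.to_sql(
#     data,
#     con,
#     schema="level0",
#     table="test",
#     mode="overwrite",
#     varchar_lengths=long_cols,
# )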
