
Char length exceeds DDL length

I am attempting to copy data into Redshift from an S3 bucket, but I am getting a 1204 error: 'char length exceeds DDL length'.

copy table_name from '[data source]'
access_key_id '[access key]'
secret_access_key '[secret access key]'
region 'us-east-1'
null as 'NA'
delimiter ','
removequotes;

The error occurs in the very first row, where it tries to put the state abbreviation 'GA' into the data_state column which is defined with the data type char(2). When I query the stl_load_errors table I get the following result:

line_number  colname      col_length  type   raw_field_value  err_code  err_reason
1            data_state   2           char   GA               1204      Char length exceeds DDL length

As far as I can tell that shouldn't exceed the length, since 'GA' is two characters and the column is defined as char(2). Does anyone know what could be causing this?

Got it working by changing the data type to char(3), but I'm still not sure why char(2) doesn't work.

Mine did this as well, also for a state column. Redshift defaults char to char(1), so I had to specify char(2). Are you sure it didn't default back to char(1)? Mine did.

Open the file up with a hex editor (or use an online one) and look at the GA value in the data_state column.

If it has three dots before it like so:

...GA

Then the file (or the source it was originally created from) was saved as UTF-8-BOM, not plain UTF-8.
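Those three dots are the BOM bytes EF BB BF. Redshift char columns are measured in bytes, so a first field of 'GA' preceded by a BOM is five bytes, which is why it overflows char(2). A quick Python illustration:

```python
import codecs

# A first field that looks like "GA" but carries a UTF-8 BOM is 5 bytes:
raw = codecs.BOM_UTF8 + b"GA"   # b'\xef\xbb\xbfGA'
print(len(raw))                  # 5 bytes, not 2 -- hence error 1204
```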

You can open the file in something like Notepad++, go to Encoding in the top bar, and select Convert to UTF-8.
