I've checked the file manually to make sure no value exceeds its column's length. Everything looked fine, but I doubled the length of every VARCHAR anyway.
I also added the TRUNCATECOLUMNS parameter, which the docs describe as:
TRUNCATECOLUMNS
Truncates data in columns to the appropriate number of characters so that it fits the column specification. Applies only to columns with a VARCHAR or CHAR data type, and rows 4 MB or less in size.
I'm still getting this error when copying from S3 to Redshift: String length exceeds DDL length
COPY [table name]
FROM [s3 path]
IAM_ROLE [iam role]
FORMAT CSV
IGNOREHEADER 1
REGION 'us-west-2'
BLANKSASNULL
TRIMBLANKS
TRUNCATECOLUMNS;
It turns out that wasn't the actual problem. This error message blankets a number of other issues, including text in a date column and strings in a numeric column, which was the actual cause here.
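When the surfaced message is misleading like this, the row that actually failed can be found in Redshift's STL_LOAD_ERRORS system table, which records the failing column, the offending raw value, and the underlying reason for each rejected row. A minimal diagnostic query (the LIMIT and ordering are just one reasonable way to look at the latest failures):

```sql
-- err_reason reports the real cause (e.g. "Invalid digit" for a string
-- in a numeric column), not the blanket "String length exceeds DDL length".
SELECT starttime,
       filename,
       line_number,
       colname,          -- column that actually failed
       type,             -- declared column type
       raw_field_value,  -- the offending value from the file
       err_reason
FROM stl_load_errors
ORDER BY starttime DESC
LIMIT 10;
```

Filtering on the query ID of the failed COPY (the `query` column) narrows this to a single load if the cluster is busy.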