
AWS Kinesis Firehose is not inserting data into Redshift

I know this question has been asked several times on Stack Overflow, but none of the answers I've read solves what I'm experiencing.

I have a Boto3 script that copies a MySQL table to Kinesis Streams. At the other end, a KCL Node.js script reads the stream from Kinesis Streams and writes to S3. So far so good.
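The producer script itself isn't shown here, so below is a minimal sketch of what that side typically looks like, assuming pymysql on the MySQL side; the table, stream, and credentials are invented placeholders:

```python
# Sketch of the MySQL -> Kinesis producer (all names/credentials are placeholders).
import json

import boto3
import pymysql

kinesis = boto3.client("kinesis", region_name="us-east-1")

conn = pymysql.connect(
    host="localhost", user="user", password="secret", database="mydb",
    cursorclass=pymysql.cursors.DictCursor,
)

with conn.cursor() as cur:
    cur.execute("SELECT * FROM orders")
    for row in cur:
        # One JSON document per record; Firehose will later COPY these
        # into Redshift with JSON 'auto', matching keys to column names.
        kinesis.put_record(
            StreamName="my-stream",
            Data=json.dumps(row, default=str),  # default=str serializes DATETIME
            PartitionKey=str(row["id"]),
        )
```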

I set up Kinesis Firehose, Redshift, and S3 in the same region. I then configured Firehose to read from S3 and write to Redshift. When I tested with the demo data provided by Firehose, everything worked.
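For reference, a delivery stream with a Redshift destination (Firehose stages the records in S3 and then issues a COPY) can be created with boto3 roughly as sketched below; every ARN, name, and credential is a placeholder, not the poster's actual configuration:

```python
# Sketch of creating a Firehose delivery stream with a Redshift destination.
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="mysql-to-redshift",
    RedshiftDestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "ClusterJDBCURL": (
            "jdbc:redshift://my-cluster.abc123.us-east-1"
            ".redshift.amazonaws.com:5439/mydb"
        ),
        "CopyCommand": {
            "DataTableName": "orders",
            "CopyOptions": "JSON 'auto'",  # match JSON keys to column names
        },
        "Username": "firehose_user",
        "Password": "secret",
        # Firehose writes to this bucket first, then COPYs into the cluster.
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
            "BucketARN": "arn:aws:s3:::my-staging-bucket",
        },
    },
)
```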

I then set up a Redshift table with all the columns from the MySQL table. Some of the MySQL data types are not supported by Redshift, so I substituted different ones, but I was confident that Firehose would still be able to write to the Redshift table without problems.

Below is the MySQL table screenshot.

[screenshot: MySQL table]

And below is the Redshift table screenshot.

[screenshot: Redshift table]

As you can see, the data types are not all the same. I wonder whether Redshift is strict to the point that every data type must match the MySQL table exactly.
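Since the screenshots may not reproduce, here is a purely hypothetical illustration of the kind of substitutions involved; these columns are invented, not the poster's schema:

```python
# Hypothetical MySQL -> Redshift type substitutions (invented columns).
import psycopg2

ddl = """
CREATE TABLE orders (
    id         BIGINT,         -- BIGINT maps directly
    created_at TIMESTAMP,      -- MySQL DATETIME -> Redshift TIMESTAMP
    is_active  SMALLINT,       -- Redshift has no TINYINT
    notes      VARCHAR(65535)  -- MySQL TEXT -> widest Redshift VARCHAR
);
"""

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="mydb", user="admin", password="secret",
)
with conn, conn.cursor() as cur:
    cur.execute(ddl)
```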

By the way, I did specify JSON 'auto' in the Firehose COPY command and enabled logging. Unfortunately, no errors are logged.

There are many reasons why you might not see records in Redshift. After Firehose puts the records in S3, Redshift runs a COPY command to load the files from S3 into the cluster.
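One quick way to surface problems is to run that same COPY by hand from a SQL client: a failing COPY raises the real error immediately instead of burying it. A sketch, with the bucket, role, and table names as placeholders:

```python
# Run Firehose's COPY manually to see load errors directly (placeholder names).
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="mydb", user="admin", password="secret",
)

copy_sql = """
    COPY orders
    FROM 's3://my-staging-bucket/'
    CREDENTIALS 'aws_iam_role=arn:aws:iam::123456789012:role/redshift-copy-role'
    JSON 'auto';
"""

with conn, conn.cursor() as cur:
    cur.execute(copy_sql)  # a bad row or a bad IAM role fails loudly here
```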

If you don't see any entries in STL_LOAD_ERRORS, then most likely Firehose can't connect to Redshift. You can verify this in the Firehose console; you'll find more information under the Redshift Logs tab.
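Both checks can be run from any SQL client; a sketch using psycopg2, with placeholder connection details:

```python
# Check whether COPY ran (and failed), and whether Firehose connected at all.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="mydb", user="admin", password="secret",
)

with conn, conn.cursor() as cur:
    # Non-empty output here means COPY ran but some rows were rejected.
    cur.execute("""
        SELECT starttime, filename, line_number, colname, err_reason
        FROM stl_load_errors
        ORDER BY starttime DESC
        LIMIT 10;
    """)
    print(cur.fetchall())

    # No connections from the Firehose user suggests a connectivity problem.
    cur.execute("""
        SELECT recordtime, event, remotehost, username
        FROM stl_connection_log
        ORDER BY recordtime DESC
        LIMIT 10;
    """)
    print(cur.fetchall())
```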

Also, make sure that your Redshift cluster's security group (in your VPC) allows incoming connections from Firehose's IP range for your region.
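That rule can be added with boto3 as sketched below; the CIDR shown is the Firehose block documented for us-east-1, so substitute the block listed for your region, and the security group ID is a placeholder:

```python
# Allow Firehose's regional IP range to reach the Redshift port.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # security group attached to the cluster
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5439,            # default Redshift port
        "ToPort": 5439,
        "IpRanges": [{
            "CidrIp": "52.70.63.192/27",  # Firehose range for us-east-1
            "Description": "Kinesis Firehose",
        }],
    }],
)
```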
