
Amazon Kinesis Firehose to S3 with Protobuf data

Has anyone tried pushing Google Protobuf (PB) data through Kinesis Firehose for storage in S3? I ask because Protobuf is (usually) a binary format, and I recall (perhaps incorrectly) that Firehose runs all data through a base64 conversion before writing. I will need to read the PB data later for processing, and I want to know whether I would need to base64-decode it before use, or whether I will have access to the raw PB data straight off S3. Thanks.

You could use a Lambda function to write the binary data directly to S3. You would need to be able to pull the file from your source within the Lambda, though, since I believe API Gateway Base64-encodes binary payloads.

You can pass binary data to Kinesis Firehose, and Kinesis will base64-encode the data before sending it to another AWS service. Look at the setData() function in the Java documentation:

Kinesis Firehose Record
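The base64 step described above can be sketched in Python. In the Java SDK, setData() takes the raw bytes as a ByteBuffer and the SDK base64-encodes them for the API transport; the byte string below is a stand-in for a serialized Protobuf message, not real PB output:

```python
import base64

# Stand-in for a serialized Protobuf message (any binary payload works).
pb_bytes = b"\x08\x96\x01\x12\x04test"

# What the SDK does under the hood when building the Firehose request:
# the raw record data is base64-encoded for transport over the API.
encoded = base64.b64encode(pb_bytes).decode("ascii")
print(encoded)
```

The key point is that you hand the SDK raw bytes; the base64 conversion is a transport detail, not something you apply yourself before calling the API.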

You would then need to decode the data for further processing after reading it back from S3, for example.
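A minimal sketch of the read-back side, assuming the S3 objects hold base64 text (the bucket and key names in the comment are hypothetical; the decode step is the relevant part):

```python
import base64

def decode_record(body: bytes) -> bytes:
    """Base64-decode an S3 object body back to raw Protobuf bytes.

    Assumes the data was stored base64-encoded; if Firehose delivered
    raw binary instead, skip this step and parse the bytes directly.
    """
    return base64.b64decode(body, validate=True)

# In practice the body would come from S3, e.g. (hypothetical names):
#   body = boto3.client("s3").get_object(
#       Bucket="my-bucket", Key="firehose/2024/record.pb")["Body"].read()
body = b"CJYBEgR0ZXN0"  # base64 of a sample Protobuf-like payload
raw = decode_record(body)
```

After decoding, `raw` is the original binary payload, ready to be parsed with the generated Protobuf classes.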
