CLI to put data into AWS Firehose

AWS Firehose was released today. I'm playing around with it and trying to figure out how to put data into the stream using the AWS CLI. I have a simple JSON payload and a corresponding Redshift table with columns that map to the JSON attributes. I've tried various combinations, but I can't seem to pass the JSON payload in via the CLI.

What I've tried:

aws firehose put-record --delivery-stream-name test-delivery-stream --record '{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --record { "attribute": 1 }

aws firehose put-record --delivery-stream-name test-delivery-stream --record Data='{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --record Data={ "attribute": 1 }

aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json '{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json { "attribute": 1 }

I've looked at the CLI help, which hasn't helped. This article was published today, but it looks like the command it uses is already outdated, as the argument "--firehose-name" has been replaced by "--delivery-stream-name".

Escape the double quotes around the keys and values inside the blob:

aws firehose put-record --delivery-stream-name test-delivery-stream --record '{"Data":"{\"attribute\":1}"}'

I had problems with my credentials and region, but this syntax at least got me past the earlier parsing errors:

aws firehose put-record --cli-input-json '{"DeliveryStreamName":"testdata","Record":{"Data":"test data"}}'
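If the remaining problem is credentials or region, the CLI's global options can be supplied on the same call. A minimal sketch, assuming a hypothetical profile name and region (swap in your own):

aws firehose put-record \
    --region us-east-1 \
    --profile my-firehose-profile \
    --cli-input-json '{"DeliveryStreamName":"testdata","Record":{"Data":"test data"}}'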

A couple of things:

  • Did you create the delivery stream?
  • By reading the doc, it seems that you should do --cli-input-json '{"Data":"blob"}' or --record 'Data=blob'
  • Try using --generate-cli-skeleton on the CLI for firehose put-record to see an example (a short sketch of this is shown below)
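To illustrate the first and last points, a short sketch (the exact skeleton output varies by CLI version):

# confirm the delivery stream actually exists
aws firehose list-delivery-streams

# print an empty input template for put-record, which can be filled in and fed back via --cli-input-json
aws firehose put-record --generate-cli-skeleton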

This should work. Escape all quotes, and replace strem_name with your stream name.

aws firehose put-record --cli-input-json "{\"DeliveryStreamName\":\"strem_name\",\"Record\":{\"Data\":\"test data\"}}"
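One caveat not covered in the answers here: newer versions of the AWS CLI (v2) treat blob parameters such as Record.Data as base64 by default, so a raw-text payload like the one above may be rejected unless you either base64-encode the data yourself or pass --cli-binary-format raw-in-base64-out. A hedged sketch, reusing the same placeholder stream name:

aws firehose put-record \
    --cli-binary-format raw-in-base64-out \
    --cli-input-json "{\"DeliveryStreamName\":\"strem_name\",\"Record\":{\"Data\":\"test data\"}}"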

This is what I have tried and it has worked.

Below are examples of sending JSON records with a single column and with multiple columns.

Single value in the Data:

Example: sending a single column that is an integer.

aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute\":1}"'

Multiple column values in the data:

Example: sending integer and string values via put-record

aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute_0\":1,\"attribute_1\":\"Sample String Value\"}"'

Example: sending integer, string and float values via put-record

aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute_0\":1,\"attribute_1\":\"Sample String Value\",\"attribute_2\":\"14.9\"}"'
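If several such records need to go out in one call, put-record-batch accepts a list of records; a rough sketch in the same escaped-JSON style (the record contents here are made up for illustration):

aws firehose put-record-batch --delivery-stream-name test-delivery-stream --records '[{"Data":"{\"attribute_0\":1,\"attribute_1\":\"Sample String Value\"}"},{"Data":"{\"attribute_0\":2,\"attribute_1\":\"Another String Value\"}"}]'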

Acknowledgement of success:

When the record is sent successfully, Kinesis acknowledges it with a RecordId similar to the one below.

{
    "RecordId": "fFKN2aJfUh6O8FsvlrfkowDZCpu0sx+37JWKJBRmN++iKTYbm/yMKE4dQHdubMR4i+0lDP/NF3c+4y1pvY9gOBkqIn6cfp+1DrB9YG4a0jXmopvhjrXrqYpwo+s8I41kRDKTL013c65vRh5kse238PC7jQ2iOWIqf21wq4dPU9R5qUbicH76soa+bZLvyhGVPudNNu2zRyZwCCV0zP/goah54d/HN9trz"
}

This indicates that the put-record command has succeeded.
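If you also want to check the stream's configuration and status (for example, the S3 or Redshift destination it delivers to), describe-delivery-stream can be run against the same stream name:

aws firehose describe-delivery-stream --delivery-stream-name test-delivery-stream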

Streamed record on S3:

This is how the record looks in S3 after Kinesis has delivered it to S3.

{"attribute":1}
{"attribute_0":1,"attribute_1":"Sample String Value"}
{"attribute_0":1,"attribute_1":"Sample String Value","attribute_2":"14.9"}

Note: In S3, the records end up in one or more files depending on the rate at which the put-record command is issued.
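To inspect those files, the regular S3 CLI commands work; the bucket name, prefix, and object key below are placeholders for whatever your delivery stream is configured to write to:

# list the objects the delivery stream has written
aws s3 ls s3://my-firehose-bucket/firehose-prefix/ --recursive

# print one delivered file to stdout
aws s3 cp s3://my-firehose-bucket/firehose-prefix/path/to/delivered-object -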

Please do try this and comment whether it works.

Thanks & Regards, Srivignesh KN
