
Kafka Connect S3 sink multiple partitions

I have multiple questions about the Kafka Connect S3 sink connector.

1. Is it possible, using the Kafka Connect S3 sink, to save records partitioned by multiple fields?

For example, I have this JSON record:

{
 "DateA":"UNIXTIMEA",
 "DateB":"UNIXTIMEB",
 "Data":"Some Data"
}

(all fields are top level)

Would it be possible to save the data in S3 under the following path:

s3://sometopic/UNIXTIMEA/UNIXTIMEB

2. Can I convert UNIXTIMEA/UNIXTIMEB into a readable date format without changing the record value itself? (for readability reasons)

3. Can I add a prefix to UNIXTIMEA in the S3 path? For example:

s3://DateA=UNIXTIMEA/DateB=UNIXTIMEB/...
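For a layout like this, the `FieldPartitioner` that ships with the Confluent S3 sink can get close without custom code. A minimal sketch of the relevant connector properties follows (the connector name and bucket name are placeholders; note that `FieldPartitioner` writes the field values into the path verbatim, so the timestamps would stay in unix form, and comma-separated multi-field support depends on the connector version):

```json
{
  "name": "s3-sink-example",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "sometopic",
    "s3.bucket.name": "my-bucket",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "partitioner.class": "io.confluent.connect.storage.partitioner.FieldPartitioner",
    "partition.field.name": "DateA,DateB",
    "flush.size": "1000"
  }
}
```

Because `FieldPartitioner` encodes each field as a `<field>=<value>` directory, this would yield paths of roughly the form `.../sometopic/DateA=UNIXTIMEA/DateB=UNIXTIMEB/...`, which also covers the prefix in question 3.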

I have just started reading the docs and am slowly getting the hang of things, but I haven't found straightforward answers to these questions.

I would like to do essentially all of this through configuration alone, but I doubt I can without a custom partitioner; I would like to confirm this as soon as possible.
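To also get readable dates in the path (question 2) without touching the record value, a custom partitioner is indeed the usual route: its `encodePartition()` reads the fields from the sink record and returns the formatted path segment. Below is a minimal, self-contained sketch of just that formatting logic; the field names, the `DateA=` prefix, and the epoch-milliseconds assumption all come from the question, and the surrounding Confluent `Partitioner` plumbing (configure, record extraction) is omitted:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

// Sketch of the path-encoding logic a custom partitioner's encodePartition()
// could return. The yyyy-MM-dd format and UTC zone are assumptions.
public class DatePathEncoder {
    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd").withZone(ZoneOffset.UTC);

    // Timestamps assumed to be unix epoch milliseconds; only the S3 object
    // path is affected, the record value itself is never modified.
    public static String encodePartition(long dateAMillis, long dateBMillis) {
        return "DateA=" + FMT.format(Instant.ofEpochMilli(dateAMillis))
             + "/DateB=" + FMT.format(Instant.ofEpochMilli(dateBMillis));
    }
}
```

In a real partitioner you would read the two field values out of the `SinkRecord` value inside `encodePartition()` and delegate to a helper like this one.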

Thanks in advance

C.potato
