
How to enable a Kafka sink connector to insert data from topics into tables as and when the sink is up

I have developed a kafka-sink-connector (using confluent-oss-3.2.0-2.11 and the Connect framework) for my data store (Amppol ADS), which stores data from Kafka topics into the corresponding tables in my store.

Everything works as expected as long as the Kafka servers and the ADS servers are up and running.

I need help/suggestions with a specific use case where events are being ingested into Kafka topics while the underlying sink component (ADS) is down. The expectation is that whenever the sink server comes back up, the records that were ingested into the Kafka topics earlier should be inserted into the tables.

Kindly advise how to handle such a case.

Is there any support available in the Connect framework for this? Even some pointers or references would be a great help.
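For reference, a custom sink connector like this is typically registered with a Connect worker through its REST API. A minimal sketch follows; the worker address, connector name, connector class, topics, and the "ads.*" property are all hypothetical placeholders, not taken from the question:

```shell
# Register the ADS sink connector with a Connect worker
# (worker at localhost:8083; name, class, and ads.* settings are hypothetical).
curl -s -X POST "http://localhost:8083/connectors" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "ads-sink",
        "config": {
          "connector.class": "com.example.ads.AdsSinkConnector",
          "tasks.max": "1",
          "topics": "orders,customers",
          "ads.connection.url": "ads://ads-host:9000"
        }
      }' || true
```

The `|| true` only keeps the sketch from failing when no worker is reachable; in practice you would inspect the HTTP response.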

Sink connector offsets are maintained in the __consumer_offsets topic on the Kafka brokers, under a consumer group derived from your connector name (connect-&lt;connector-name&gt;). When the sink connector restarts, it resumes consuming messages from the Kafka brokers at the last offset it committed to that topic.
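You can inspect those committed offsets with the standard consumer-group tooling shipped with Kafka. In this sketch the broker address and the connector name "ads-sink" are assumptions; only the connect-&lt;connector-name&gt; group-naming convention is Connect's own:

```shell
# Show the committed offsets and lag for a sink connector's consumer group.
# Broker address and connector name "ads-sink" are hypothetical.
if command -v kafka-consumer-groups >/dev/null 2>&1; then
  kafka-consumer-groups --bootstrap-server localhost:9092 \
    --describe --group connect-ads-sink
fi
```

The LAG column of that output shows how many records accumulated in the topics while the sink was down and are still waiting to be delivered.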

So you don't have to worry about managing offsets; that is all handled by the workers in the Connect framework. In your scenario, you simply restart your sink connector. If the messages were pushed to Kafka by your source connector and are still available in Kafka, the sink connector can be started/restarted at any time and will pick up where it left off.
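Concretely, once the ADS servers are back up, the connector can be restarted through the Connect worker's REST API. The worker address and connector name below are assumptions; the /restart and /status endpoints are part of the Connect REST API:

```shell
# Restart the sink connector once the downstream store is reachable again
# (worker at localhost:8083 and connector name "ads-sink" are hypothetical).
# Committed offsets are untouched, so consumption resumes where it stopped.
curl -s -X POST "http://localhost:8083/connectors/ads-sink/restart" || true

# Then verify the connector and its tasks are RUNNING:
curl -s "http://localhost:8083/connectors/ads-sink/status" || true
```

If the connector was explicitly paused rather than failed, PUT /connectors/&lt;name&gt;/resume is the corresponding call.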

