
How to make a Kafka sink connector insert data from topics into tables as and when the sink comes up

I have developed a Kafka sink connector (using confluent-oss-3.2.0-2.11 and the Connect framework) for my data store (Amppol ADS). It stores data from Kafka topics into the corresponding tables in my store.

Everything works as expected as long as the Kafka servers and ADS servers are up and running.

I need help/suggestions for a specific use case: events keep getting ingested into Kafka topics while the underlying sink component (ADS) is down. The expectation is that whenever the sink servers come back up, the records that were ingested into the Kafka topics earlier should be inserted into the tables.

Kindly advise how to handle such a case.

Is there any support available in the Connect framework for this? Or at least some references would be a great help.

Sink connector offsets are maintained in the __consumer_offsets topic on the Kafka brokers against your connector name, and when the sink connector restarts it will pick up messages from Kafka starting at the last offset it committed to __consumer_offsets.
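
By default the consumer group a sink connector commits under is "connect-" plus the connector name, so you can inspect its committed offsets yourself. A minimal sketch using the Kafka AdminClient, assuming a recent kafka-clients version (newer than the one bundled with confluent-oss-3.2.0) and a hypothetical connector named "my-ads-sink":

    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class ShowConnectorOffsets {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // Sink connectors commit under the group "connect-<connector name>" by default.
                // "my-ads-sink" is a hypothetical name; use the one from your connector config.
                Map<TopicPartition, OffsetAndMetadata> offsets =
                        admin.listConsumerGroupOffsets("connect-my-ads-sink")
                             .partitionsToOffsetAndMetadata()
                             .get();

                offsets.forEach((tp, om) ->
                        System.out.println(tp + " -> committed offset " + om.offset()));
            }
        }
    }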

So you don't have to worry about managing offsets at all; it is all done by the workers in the Connect framework. In your scenario, just restart your sink connector. As long as the messages pushed to Kafka by your source connector are still available in Kafka, the sink connector can be started or restarted at any time.
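
If you also want the connector itself to cope with ADS going down while it is running (not only when it is stopped and restarted), the task can throw RetriableException so the framework retries the batch without committing its offsets. A minimal SinkTask sketch, with a hypothetical writeToAds helper standing in for your actual ADS insert logic:

    import java.io.IOException;
    import java.util.Collection;
    import java.util.Map;

    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.connect.errors.RetriableException;
    import org.apache.kafka.connect.sink.SinkRecord;
    import org.apache.kafka.connect.sink.SinkTask;

    public class AdsSinkTask extends SinkTask {

        @Override
        public void start(Map<String, String> props) {
            // Open the connection to ADS here, using values from the connector config.
        }

        @Override
        public void put(Collection<SinkRecord> records) {
            for (SinkRecord record : records) {
                try {
                    writeToAds(record);
                } catch (IOException e) {
                    // Throwing RetriableException asks the framework to redeliver the
                    // batch later instead of committing its offsets, so nothing is
                    // lost while ADS is unreachable.
                    throw new RetriableException("ADS unavailable, will retry batch", e);
                }
            }
        }

        @Override
        public void flush(Map<TopicPartition, OffsetAndMetadata> currentOffsets) {
            // Called before offsets are committed; make sure pending writes are durable.
        }

        @Override
        public void stop() {
            // Close the ADS connection here.
        }

        @Override
        public String version() {
            return "1.0";
        }

        // Hypothetical placeholder for the real ADS insert: map record.topic() to a
        // table and insert record.value().
        private void writeToAds(SinkRecord record) throws IOException {
            // store-specific write goes here
        }
    }

Because a batch's offsets are only committed after put() succeeds, the same records are simply redelivered once ADS is back.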
