
kafka-connect-elasticsearch: how to send deletes of documents?

I have a processing stream that looks like this:

mysql.database -> debezium-connector -> database topic -> faust.agent(stream processing to add a field) -> sink topic -> elasticsearch-sink-connector -> elasticsearch cluster

This pipeline works for the most part, but I'm having trouble figuring out how to handle delete events coming from the database topic: if a row is deleted in MySQL, I want the corresponding document removed from Elasticsearch as well. I can add a conditional in the Faust agent to manipulate the event. Is there a way to mark an event so that when the elasticsearch-sink-connector picks it up, it deletes the given document instead of indexing it? I've looked through the documentation, but I don't see anything specific about this. Is the sink connector meant only to add documents to an index?

Looking at the connector's configuration, you can set behavior.on.null.values to delete. Then you just need to make sure a tombstone (a record with a null value) is written for each key whose document should be deleted.
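For reference, a minimal sink-connector config might look like the following (the connector name, topic, and connection URL are placeholders for your setup). Note that key.ignore must stay false so document IDs are derived from the record keys, which is what deletes match on:

```json
{
  "name": "es-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "sink-topic",
    "connection.url": "http://localhost:9200",
    "key.ignore": "false",
    "behavior.on.null.values": "delete"
  }
}
```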

By default, Debezium generates a tombstone message after each delete event (controlled by its tombstones.on.delete setting), so the tombstones should already be arriving on your database topic.
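For the Faust step in your pipeline, the main thing is to forward tombstones unchanged rather than trying to enrich them. Here is a minimal sketch, assuming raw serializers and hypothetical broker/topic names (not from your actual setup):

```python
import json

import faust

# Hypothetical names; substitute your broker URL and topic names.
app = faust.App("enricher", broker="kafka://localhost:9092")

# 'raw' serializers so tombstone values (null) pass through untouched.
src = app.topic("database-topic", key_serializer="raw", value_serializer="raw")
dst = app.topic("sink-topic", key_serializer="raw", value_serializer="raw")


@app.agent(src)
async def enrich(stream):
    # Iterate events (not bare values) so we keep access to the record key.
    async for event in stream.events():
        if not event.value:
            # Tombstone (depending on the codec it may arrive as None or an
            # empty payload): forward it as-is so the ES sink connector
            # (behavior.on.null.values=delete) removes the document.
            await dst.send(key=event.key, value=None)
        else:
            record = json.loads(event.value)
            record["extra_field"] = "added by faust"  # your existing enrichment
            await dst.send(key=event.key, value=json.dumps(record).encode())


if __name__ == "__main__":
    app.main()
```

Since the Elasticsearch document ID comes from the record key, the forwarded tombstone must carry exactly the same key as the original upsert for the delete to hit the right document.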
