kafka-connect-elasticsearch: how to send deletes of documents?
I have a processing stream that looks like this:
mysql.database -> debezium-connector -> database topic -> faust.agent(stream processing to add a field) -> sink topic -> elasticsearch-sink-connector -> elasticsearch cluster
This processing stream is working for the most part, but I'm having trouble figuring out how to handle deleted-row events coming from the database topic: if a row gets deleted, I want the corresponding document to be removed from Elasticsearch as well. I can use a conditional in the Faust section to manipulate the event. Is there a way to mark an event so that when it's picked up by the elasticsearch-sink-connector it removes the given document instead of adding it? I've looked through the documentation, but I don't see specifics on this. Is the sink connector meant only to add documents to an index?
Looking at the config for the connector, it looks like you can set `behavior.on.null.values` to `delete`. Then you just need to make sure you set a tombstone (a null value) against keys for which the document should be deleted.
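For example, a minimal sink connector config might look like the sketch below (the connector name, topic, and connection URL are placeholders for your setup). Note that `key.ignore` needs to be `false`, because the connector uses the record key as the document ID and therefore needs it to know which document to delete:

```json
{
  "name": "elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "mytable-enriched",
    "connection.url": "http://localhost:9200",
    "key.ignore": "false",
    "behavior.on.null.values": "delete"
  }
}
```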
Debezium will by default generate tombstone messages for deletes.
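(This is controlled by Debezium's `tombstones.on.delete` setting, which defaults to `true`.) One thing to watch in this pipeline is the Faust agent in the middle: it has to forward the tombstones to the sink topic instead of trying to deserialize and enrich them. Below is a minimal sketch of what that conditional might look like, assuming raw serializers, placeholder topic and field names, and that Faust's raw codec hands tombstones through as `None` values:

```python
import json
import faust

app = faust.App('enricher', broker='kafka://localhost:9092')

# Placeholder topic names; substitute your Debezium topic and sink topic.
source = app.topic('dbserver1.mydb.mytable',
                   key_serializer='raw', value_serializer='raw')
sink = app.topic('mytable-enriched',
                 key_serializer='raw', value_serializer='raw')

@app.agent(source)
async def enrich(stream):
    # Iterate over raw events so we can inspect both the key and the value.
    async for event in stream.events():
        if event.value is None:
            # Tombstone from Debezium: forward it unchanged so the
            # Elasticsearch sink connector deletes the document.
            await sink.send(key=event.key, value=None)
            continue
        record = json.loads(event.value)
        record['my_extra_field'] = 'enriched'  # the added field
        await sink.send(key=event.key, value=json.dumps(record).encode())
```

Note that with Debezium's defaults a delete produces both a delete event (with `"op": "d"`) and a tombstone, so depending on your enrichment logic you may want to drop or special-case the delete event as well.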