
Kafka Connect JDBC Sink Connector: How to delete a record that doesn't have a NULL value?

Is there a (recommended) way to delete a record from a Kafka Connect JDBC Sink Connector where the record's value is not NULL?

For example, if my JSON configuration includes the following:

...
"delete.enabled": "true",
"pk.mode": "record_key",
...

And my record's value is non-null, is there a way to have that record be deleted in the database?

I ask because the record's value has a field that marks whether it should be deleted, i.e. a column like "Operation" where "Operation" == "D" should result in a delete in the database via JDBC.

If there is a standard/recommended way to do this, I would love to hear it. My only other idea was to write a custom transform that checks the "Operation" column for the value "D"; if it matches, we pass back the record with the PK intact but with the value set to NULL, i.e. a tombstone record, which should be picked up by the connector as a delete operation. Is that a possibility?
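For illustration, here is a minimal sketch of such a transform, assuming the record value is a Connect Struct; the package and class names are made up, and the "Operation"/"D" check is hard-coded rather than made configurable:

package com.example.kafka.transforms; // hypothetical package name

import java.util.Map;

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.ConnectRecord;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.transforms.Transformation;

// Hypothetical SMT: turns records whose "Operation" field is "D" into tombstones.
public class TombstoneOnDelete<R extends ConnectRecord<R>> implements Transformation<R> {

    @Override
    public R apply(R record) {
        // Leave existing tombstones and non-Struct values untouched.
        if (!(record.value() instanceof Struct)) {
            return record;
        }
        Struct value = (Struct) record.value();
        // Only act if the value schema actually has an "Operation" field.
        if (value.schema().field("Operation") == null) {
            return record;
        }
        if ("D".equals(value.get("Operation"))) {
            // Keep the key (and key schema) intact, null out the value and value
            // schema so the record becomes a tombstone for the JDBC sink to delete.
            return record.newRecord(record.topic(), record.kafkaPartition(),
                    record.keySchema(), record.key(), null, null, record.timestamp());
        }
        // Condition not met: pass the original record through unchanged.
        return record;
    }

    @Override
    public ConfigDef config() {
        return new ConfigDef(); // no custom configuration options in this sketch
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // nothing to configure in this sketch
    }

    @Override
    public void close() {
        // no resources to release
    }
}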

I appreciate any help, thank you!

No responses yet, but I got my somewhat hacky solution to work:

  • Created a custom Transform that sets the record's value to NULL (i.e. produces a tombstone record) if a certain condition is met (in my case, checking a field in the record's value)
  • The Transform returns the original record if the condition is not met
  • Packaged it into a JAR
  • Provided the JAR on the "plugin.path"
  • Made sure "delete.enabled":"true" and "pk.mode":"record_key" are set so that tombstone records are actually deleted
  • When sending the POST request to instantiate the connector, included the transform and any relevant configuration in the body of the POST (see the example request body after this list)
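A request body along these lines would wire everything together; the connector name, topic, connection URL, and the transform alias/class are placeholders rather than values from the original setup, and "io.confluent.connect.jdbc.JdbcSinkConnector" assumes the Confluent JDBC sink connector:

{
  "name": "jdbc-sink-with-tombstone-delete",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "my-topic",
    "connection.url": "jdbc:postgresql://db-host:5432/mydb",
    "insert.mode": "upsert",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "transforms": "tombstoneOnDelete",
    "transforms.tombstoneOnDelete.type": "com.example.kafka.transforms.TombstoneOnDelete"
  }
}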

Hope this helps
