
Kafka Connect JDBC Sink - pk.fields for each topic (table) in one sink configuration

With respect to this example: debezium-example

I have multiple topics with different primary keys:

item (pk: id)
itemDetail (pk: id, itemId)
itemLocation (pk: id, itemId)

jdbc-sink.source

{
  "name": "jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "item,itemDetail,itemLocation",
    "connection.url": "jdbc:postgresql://postgres:5432/inventory?user=postgresuser&password=postgrespw",
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.fields": "id",
    "pk.mode": "record_value"
  }
}

How can we specify "pk.fields" for each topic (table)?

I don't think there is such a configuration for a per-topic PK mapping within a single connector.

You will want to create a separate connector config for each topic:

{
  "name": "jdbc-sink-item",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "item",
    "pk.fields": "id",
    ...
  }
}
And

{
  "name": "jdbc-sink-itemDetail",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "itemDetail",
    "pk.fields": "id,itemId",
    ...
  }
}

And so on.
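Since the per-topic connectors differ only in their name, topics, and pk.fields, the definitions can be generated from one mapping rather than written by hand. A minimal sketch in Python; the topic-to-PK mapping is taken from the question, and the shared settings are copied from the original jdbc-sink config (the deployment URL in the comment is an assumption):

```python
import json

# Per-topic primary-key mapping from the question.
PK_FIELDS = {
    "item": "id",
    "itemDetail": "id,itemId",
    "itemLocation": "id,itemId",
}

# Shared settings copied from the original jdbc-sink config.
BASE_CONFIG = {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:postgresql://postgres:5432/inventory?user=postgresuser&password=postgrespw",
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_value",
}

def build_connector(topic: str, pk_fields: str) -> dict:
    """Build one sink-connector definition for a single topic."""
    config = dict(BASE_CONFIG)
    config["topics"] = topic
    config["pk.fields"] = pk_fields
    return {"name": f"jdbc-sink-{topic}", "config": config}

connectors = [build_connector(t, pk) for t, pk in PK_FIELDS.items()]

for c in connectors:
    print(json.dumps(c, indent=2))
    # Each definition would then be registered with the Kafka Connect
    # REST API via POST to /connectors (host/port is an assumption), e.g.:
    #   curl -X POST -H "Content-Type: application/json" \
    #        -d @jdbc-sink-item.json http://connect:8083/connectors
```

This keeps the shared JDBC settings in one place, so a change to the connection URL or insert mode only has to be made once.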
