How do I add multiple topics to a JDBC Sink Connector configuration so that each topic's data lands in its own target table?
Below is my JDBC Sink connector configuration:
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
behavior.on.null.values=ignore
table.name.format=kafka_Address_V1, kafka_Attribute_V1
connection.password=***********
topics=Address,Attribute
tasks.max=3
batch.size=500
value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
value.converter.schema.registry.url=http://localhost:8081
auto.evolve=true
connection.user=user
name=sink-jdbc-connector
errors.tolerance=all
auto.create=true
value.converter=io.confluent.connect.avro.AvroConverter
connection.url=jdbc:sqlserver://localhost:DB;
insert.mode=upsert
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
pk.mode=record_value
pk.fields=id
With this configuration I get a single table in the target database, named kafka_Address_V1, kafka_Attribute_V1 (the two values concatenated), instead of one table per topic.
How can I store each topic's data in a separate table using the JDBC Sink Connector?
Per the docs, table.name.format takes a single value, and defaults to using the topic name itself.
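For illustration, here is a minimal sketch of what that default means (`${topic}` is the documented placeholder for the originating topic name; the comment values are assumptions based on the topics in the question):

```
# Default behaviour: each topic maps to a table of the same name.
table.name.format=${topic}
topics=Address,Attribute
# -> tables "Address" and "Attribute" in the target database
```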
To achieve what you want, you can use the RegexRouter Single Message Transform to modify the topic name as it is processed by Kafka Connect.
Try this:
transforms=changeTopicName
transforms.changeTopicName.type=org.apache.kafka.connect.transforms.RegexRouter
transforms.changeTopicName.regex=(.*)
transforms.changeTopicName.replacement=kafka_$1_V1
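To see what this transform does to the topic names, here is a small Python sketch that mirrors the regex rename (illustrative only; the real RegexRouter SMT runs inside Kafka Connect, and the `rename` helper is hypothetical):

```python
import re

# Mirrors the SMT settings: regex=(.*), replacement=kafka_$1_V1
regex = re.compile(r"(.*)")
replacement = r"kafka_\g<1>_V1"

def rename(topic: str) -> str:
    # Apply the pattern once to the whole topic name,
    # as RegexRouter does when the regex matches the topic.
    return regex.sub(replacement, topic, count=1)

print(rename("Address"))    # kafka_Address_V1
print(rename("Attribute"))  # kafka_Attribute_V1
```

With the topic renamed this way, the default table.name.format picks up the new name, so records from Address go to kafka_Address_V1 and records from Attribute go to kafka_Attribute_V1, matching the table names the question wanted.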