Kafka JDBC sink connector - is it possible to store the topic data as a json in DB
Kafka JDBC sink connector - is it possible to store the topic data as JSON in a Postgres DB? Currently it parses each JSON field from the topic and maps it to the corresponding column in the table.
If anyone has worked on a similar case, could you please tell me which config details I should add to the connector configuration?
I used the config below, but it didn't work:
"key.converter":"org.apache.kafka.connect.storage.StringConverter",
"key.converter.schemas.enable":"false",
"value.converter":"org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable":"false"
The JDBC sink requires a Struct type (JSON with Schema, Avro, etc.).
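For reference, when value.converter.schemas.enable is set to true, the JsonConverter expects every message to carry an inline schema envelope. A minimal sketch of such a message (the field names "id" and "data" are assumptions for illustration):

{
  "schema": {
    "type": "struct",
    "fields": [
      { "field": "id", "type": "int32", "optional": false },
      { "field": "data", "type": "string", "optional": true }
    ],
    "optional": false
  },
  "payload": {
    "id": 1,
    "data": "{\"any\":\"json\",\"stored\":\"as-is\"}"
  }
}

With a message shaped like this, the sink maps "id" and "data" to table columns of the same names.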
If you want to store a string, that string needs to be the value of a key that corresponds to a database column. That string can be anything, including delimited JSON.
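If the goal is to keep the whole record as one JSON document in a single column, one option is to keep the StringConverter and wrap the raw string into a single-field Struct with the HoistField SMT, so the JDBC sink writes it to one column. A minimal sketch, assuming a table with a single text/jsonb column named payload; the topic name, connection URL, and the stringtype=unspecified driver option (which lets the Postgres JDBC driver cast the bound string into a jsonb column) are assumptions, not part of the original question:

{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "topics": "my-topic",
  "connection.url": "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "org.apache.kafka.connect.storage.StringConverter",
  "transforms": "wrap",
  "transforms.wrap.type": "org.apache.kafka.connect.transforms.HoistField$Value",
  "transforms.wrap.field": "payload",
  "insert.mode": "insert",
  "auto.create": "false"
}

HoistField$Value wraps the string value into a Struct with the single field payload, which satisfies the sink's Struct requirement; declaring the payload column as jsonb on the Postgres side keeps the stored document validated and queryable as JSON.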