
Kafka Connect JDBC Sink Connector

I am trying to write data from a topic (JSON data) into a MySQL database. I believe I want a JDBC Sink Connector.

How do I configure the connector to map the JSON data in the topic to the rows it inserts into the database?

The only documentation I can find is this:

"The sink connector requires knowledge of schemas, so you should use a suitable converter eg the Avro converter that comes with Schema Registry, or the JSON converter with schemas enabled. Kafka record keys if present can be primitive types or a Connect struct, and the record value must be a Connect struct. Fields being selected from Connect structs must be of primitive types. If the data in the topic is not of a compatible format, implementing a custom Converter may be necessary."

But how do you configure it? Any examples?

I assume that means you need to use the Confluent Schema Registry?

For "better" schema support, then yes. But no, it is not required.

You can use the JsonConverter with schemas.enable=true

Your JSON messages will need to look like this, though:

{
   "schema" : {
      ... data that describes the payload
   }, 
   "payload": {
      ... your actual data
   }
}
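As a concrete sketch, assuming a hypothetical target table with just an id and a name column, a complete message could look like the following (the field names and types are only an illustration, not something from your topic):

{
   "schema": {
      "type": "struct",
      "optional": false,
      "fields": [
         { "type": "int64", "field": "id" },
         { "type": "string", "field": "name" }
      ]
   },
   "payload": {
      "id": 1,
      "name": "Alice"
   }
}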

For a reference to this format, see this blog post.
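For the connector itself, a minimal sink configuration (for example, posted to the Connect REST API) might look roughly like this; the connector name, topic, and connection details are placeholders you would replace with your own:

{
   "name": "mysql-sink",
   "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
      "topics": "my-topic",
      "connection.url": "jdbc:mysql://localhost:3306/mydb",
      "connection.user": "user",
      "connection.password": "password",
      "insert.mode": "insert",
      "auto.create": "true",
      "key.converter": "org.apache.kafka.connect.storage.StringConverter",
      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable": "true"
   }
}

With schemas.enable=true on the value converter, the sink reads the schema block from each message and maps the payload fields to table columns; auto.create lets the connector create the destination table if it does not already exist.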

You can also use Kafka Streams or KSQL to more easily convert "schemaless" JSON into an Avro payload that carries a schema.
