
How to make the Instaclustr Kafka Sink Connector work with Avro-serialized values to Postgres?

I have a Kafka topic with Avro-serialized values.

I am trying to set up a JDBC (Postgres) sink connector to dump these messages into a Postgres table.

But I am getting the following error:

"org.apache.kafka.common.config.ConfigException: Invalid value io.confluent.connect.avro.AvroConverter for configuration value.converter: Class io.confluent.connect.avro.AvroConverter could not be found."

My sink configuration (postgres-sink.json) is:

{"name": "postgres-sink",
  "config": {
    "connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max":"1",
    "topics": "<topic_name>",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "instaclustr_schema_registry_host:8085",
    "connection.url": "jdbc:postgresql://postgres:5432/postgres?currentSchema=local",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "auto.create": "true",
    "auto.evolve":"true",
    "pk.mode":"none",
    "table.name.format": "<table_name>"
  }
}

I have also updated connect-distributed.properties (bootstrap servers).

The command I am running is:

curl -X POST -H "Content-Type: application/json" --data @postgres-sink.json https://<instaclustr_schema_registry_host>:8083/connectors

io.confluent.connect.avro.AvroConverter is not part of the Apache Kafka distribution. You can either run Apache Kafka as part of Confluent Platform (which ships with the converter, and is the easier option), or download the converter separately and install it into your existing installation yourself.
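If you go the manual route, Kafka Connect discovers converter jars through the worker's plugin.path setting. A minimal sketch of the relevant connect-distributed.properties entry (the directory /opt/connect-plugins and the confluent-hub install command shown in the comment are assumptions; adjust to wherever you unpack the jars):

```properties
# connect-distributed.properties — sketch, assuming the Avro converter jars
# were unpacked to /opt/connect-plugins, e.g. via:
#   confluent-hub install --component-dir /opt/connect-plugins \
#       confluentinc/kafka-connect-avro-converter:latest

# Kafka Connect scans each directory on plugin.path at startup for
# connector, converter, and transform classes.
plugin.path=/opt/connect-plugins
```

Restart every Connect worker after changing plugin.path; the classpath is only scanned at startup, so the ConfigException will persist until the workers are restarted with the converter on the path.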

