I have a producer class that sends to a topic using a custom JsonSerializer from GitHub:
public class JsonSerializer<T> implements Serializer<T> {
    ...
    @Override
    public byte[] serialize(String topic, T data) {
        try {
            return this.objectMapper.writeValueAsBytes(data);
        } catch (JsonProcessingException e) {
            throw new SerializationException(e);
        }
    }
    ...
}
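For context, the producer would typically be wired up with this serializer through its client properties. A minimal sketch, assuming the serializer lives in a hypothetical `com.example.serialization` package (the package name and broker address are assumptions, not from the original post):

```properties
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
# fully qualified name of the custom serializer above (hypothetical package)
value.serializer=com.example.serialization.JsonSerializer
```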
And I am running the DataStax Kafka Connector with this configuration:
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
I got this error while the connector was trying to consume the topic:
[2020-01-12 13:57:53,324] WARN Error inserting/updating row for Kafka record SinkRecord{kafkaOffset=416, timestampType=CreateTime} ConnectRecord{topic='test-3', kafkaPartition=17, key=null, keySchema=Schema{STRING}, value={}, valueSchema=null, timestamp=1578811437723, headers=ConnectHeaders(headers=)}: Primary key column(s) mmsi, ts cannot be left unmapped. Check that your mapping setting matches your dataset contents. (com.datastax.kafkaconnector.DseSinkTask:286)
From that error, I suspect the connector is unable to retrieve the JSON data. What am I doing wrong?
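The error message itself points at the connector's mapping setting: the sink requires every primary-key column to be mapped to a field in the record. A sketch of what such a mapping might look like, assuming a hypothetical keyspace/table `my_ks.ais_data` and JSON fields named `mmsi` and `ts` (all of these names are assumptions; the mapping property format follows the DataStax connector docs):

```properties
topics=test-3
# map each Cassandra primary-key column to a field of the record value
topic.test-3.my_ks.ais_data.mapping=mmsi=value.mmsi, ts=value.ts
```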
UPDATE
I tried the Kafka JsonSerializer.
I tried StringSerializer, since the connector says it is supported as well.
I found that some data is actually written to the database, but always a small fraction of the total data sent to the Kafka topic (about 5 to 10 records).
I kept the connector running and found that after a failed write, it never writes again.
It turned out to be a configuration problem. As I mentioned in the update, the connector never wrote data again after hitting an error. This is because the DataStax connector has an ignoreErrors setting, which defaults to false. That means if the connector encounters an error in a message, it retries that message indefinitely, blocking further writes. I set it to true, and the problem was solved.
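The fix is a one-line change in the connector's configuration. A sketch, assuming the property name matches the DataStax connector version described above (newer connector versions may use enumerated values such as None/All instead of a boolean, so check the docs for your version):

```properties
# skip records that fail to write instead of retrying them forever
ignoreErrors=true
```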