I have a producer with this configuration:
kafka:
  bootstrap-servers: localhost:9092
cloud:
  stream:
    binder:
      consumer-properties:
        key.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
        value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
        schema.registry.url: http://localhost:8081
      properties:
        specific.avro.reader: true
    schemaRegistryClient:
      endpoint: http://localhost:8081
    bindings:
      event-in-0:
        destination: event-details
        contentType: application/*+avro
        group: group1
    function:
      definition: event
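Note that the configuration above only declares an input binding (event-in-0), while the code below publishes via streamBridge.send("event-out-0", ...). For reference, a producer-side binding would presumably look along these lines (a sketch, assuming the same event-details topic and the Confluent Avro serializers; adjust to your actual setup):

```yaml
cloud:
  stream:
    bindings:
      event-out-0:
        destination: event-details
        contentType: application/*+avro
    kafka:
      binder:
        producer-properties:
          key.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
          value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
          schema.registry.url: http://localhost:8081
```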
This is my schema:
{
  "type": "record",
  "name": "Event",
  "namespace": "com.example.schema.avro",
  "fields": [
    {
      "name": "eventId",
      "type": "int"
    }
  ]
}
and the code to publish the message:
public void sendEvent(final EventDto eventDto) {
    final Event event = event().apply(eventDto);
    final Message<Event> message = MessageBuilder.withPayload(event)
            .setHeader("partitionKey", eventDto.eventId())
            .setHeader("customHeader", "test")
            .build();
    streamBridge.send("event-out-0", message);
    log.info(String.valueOf(message));
}
This produces the correct event, as seen in the logs:
2022-08-15 09:12:23.988 INFO 25112 --- [nio-9090-exec-5] c.e.eventproducer.service.EventService : GenericMessage [payload={"eventId": 200}, headers={customHeader=test, id=70c3e52c-f419-cf0c-8fae-0ba07d8876da, partitionKey=200, timestamp=1660547543987}]
Now, at the consumer end, I expect the same eventId (200), but I always get 0, no matter what the event ID is. This is my consumer configuration:
kafka:
  bootstrap-servers: localhost:9092
cloud:
  stream:
    binder:
      consumer-properties:
        key.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
        value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
        schema.registry.url: http://localhost:8081
      properties:
        specific.avro.reader: true
    schemaRegistryClient:
      endpoint: http://localhost:8081
    bindings:
      event-in-0:
        destination: event-details
        contentType: application/*+avro
        group: group1
    function:
      definition: event
and this is the consumer that subscribes to the messages:
@Bean
public Consumer<Message<Event>> event() {
    return e -> {
        log.info(e.toString());
        eventRepository.save(new com.example.eventconsumer.doamin.Event(e.getPayload().getEventId()));
    };
}
and the log shows:
2022-08-15 09:12:23.992 INFO [,b19cd949219432acf232403bdcea45c2,1464ff27cfed8011] 24596 --- [container-0-C-1] c.e.eventconsumer.service.EventService : GenericMessage [payload={"eventId": 0}, headers={customHeader=test, deliveryAttempt=1, kafka_timestampType=CREATE_TIME, scst_partition=0, kafka_receivedTopic=event-details, kafka_offset=97, partitionKey=200, scst_nativeHeadersPresent=true, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@41d9d95, source-type=kafka, id=0762521f-069b-a91d-acc5-69e6bb2eb4eb, kafka_receivedPartitionId=0, contentType=application/vnd.event.v1+avro, kafka_receivedTimestamp=1660547543987, kafka_groupId=group1, timestamp=1660547543992}]
Interestingly, if I pass the event ID as the partition key, I can read it back correctly from the header (partitionKey=200), but the payload's eventId still arrives as 0.
I believe the issue is with your DTO: the value of eventId is not being set correctly, so it always falls back to the default value for int, which is 0.
This is how the auto-generated class from your .avsc looks, so we need to make sure eventId is set on your Event instance appropriately:
/** Gets the value of the 'eventId' field. */
public java.lang.Integer getEventId() {
    return eventId;
}

/** Sets the value of the 'eventId' field. */
public com.example.schema.avro.Event.Builder setEventId(int value) {
    validate(fields()[0], value);
    this.eventId = value;
    fieldSetFlags()[0] = true;
    return this;
}

public Event build() {
    try {
        Event record = new Event();
        record.eventId = fieldSetFlags()[0] ? this.eventId : (java.lang.Integer) defaultValue(fields()[0]);
        return record;
    } catch (Exception e) {
        throw new org.apache.avro.AvroRuntimeException(e);
    }
}
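The fallback in build() above is exactly where an unset field becomes 0: if fieldSetFlags()[0] is false, the field takes the schema default, which is 0 for an int. A minimal self-contained mock (a hypothetical stand-in, not the real generated class) demonstrates the behavior:

```java
// Minimal mock of the Avro builder pattern shown above (hypothetical
// stand-in for the generated com.example.schema.avro.Event class).
public class EventDefaultDemo {

    static class Event {
        int eventId;
    }

    static class Builder {
        private int eventId;
        private boolean eventIdSet;   // mirrors fieldSetFlags()[0]

        Builder setEventId(int value) {
            this.eventId = value;
            this.eventIdSet = true;
            return this;
        }

        Event build() {
            Event record = new Event();
            // Same logic as the generated build(): if the field was never
            // explicitly set, fall back to the schema default (0 for int).
            record.eventId = eventIdSet ? eventId : 0;
            return record;
        }
    }

    public static void main(String[] args) {
        System.out.println(new Builder().build().eventId);                  // 0 (never set)
        System.out.println(new Builder().setEventId(200).build().eventId);  // 200
    }
}
```

So if the producer builds the record without going through setEventId (or the DTO mapping never populates it), the consumer will always deserialize eventId as 0 even though the headers come through intact.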