Getting exception while sending avro message, exception is org.apache.kafka.common.errors.SerializationException: Error registering Avro schema:
I am using Spring Cloud Stream and trying to publish an Avro message, but I get the above exception.
I have the following properties:
spring.cloud.stream.bindings.feed-output-channel.producer.useNativeEncoding=true
spring.cloud.stream.bindings.feed-output-channel.destination=TOPI.NAME
spring.cloud.stream.bindings.feed-output-channel.producer.partition-count=1
spring.cloud.stream.bindings.feed-output-channel.contentType=application/*+avro
spring.cloud.stream.bindings.feed-output-channel.producer.partition-key-expression=headers['kafka_messageKey']
spring.cloud.stream.kafka.bindings.feed-output-channel.producer.configuration.request.timeout.ms=60000
spring.cloud.stream.kafka.binder.producer-properties.key.serializer=org.apache.kafka.common.serialization.StringSerializer
spring.cloud.stream.kafka.binder.producer-properties.value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.cloud.stream.schema.avro.dynamicSchemaGenerationEnabled=true
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.url=https://base-url:8081/
spring.cloud.stream.kafka.binder.brokers=g-vmx.com:9092,g-vmx.com:9092,g-vmx.com:9092
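Since useNativeEncoding=true hands serialization over to the configured Kafka serializers, the producer-properties above end up as plain Kafka producer config. As a rough sketch of the equivalent raw producer config (property names taken from the settings above; this is an illustration, not the binder's actual internal code):

```java
import java.util.Properties;

public class ProducerConfigSketch {

    public static Properties buildProducerProps() {
        Properties props = new Properties();
        // Keys are serialized as plain strings, values via the Confluent Avro serializer
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // The Avro serializer reads this entry to locate the Schema Registry
        props.put("schema.registry.url", "https://base-url:8081/");
        // Per-binding request timeout from the kafka.bindings configuration
        props.put("request.timeout.ms", "60000");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProducerProps().getProperty("value.serializer"));
    }
}
```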
I have also added the Confluent schema registry bean, with the @EnableSchemaRegistryClient annotation on the main class:
@Bean
@Primary
public SchemaRegistryClient schemaRegistryClient() {
    log.info("schema registry bean");
    ConfluentSchemaRegistryClient client = new ConfluentSchemaRegistryClient();
    client.setEndpoint(endPoint);
    return client;
}
The code I use to publish the message:
public void publishFeed(String market, String sku) {
    token.link();
    MessageChannel messageChannel = resolveMessageChannel();
    String messageKey = String.format("%s_%s", market, sku);
    RequestPayload payload = buildPayload(market, sku);
    Message<RequestPayload> message = MessageBuilder
            .withPayload(payload)
            .setHeader(KafkaHeaders.MESSAGE_KEY, messageKey)
            .build();
    log.info("publish requestPayload message {} ", message);
    boolean sent = messageChannel.send(message, DEFAULT_TIMEOUT_TO_SEND_MESSAGE);
    if (!sent) {
        throw new MessagePublishException("Unable to send=" + message);
    }
}
Dependencies:
implementation 'org.springframework.cloud:spring-cloud-stream-schema:2.2.1.RELEASE'
implementation "io.confluent:kafka-avro-serializer:5.3.0"
implementation "org.apache.avro:avro:1.10.1"
The exception is thrown at this line: messageChannel.send(message, DEFAULT_TIMEOUT_TO_SEND_MESSAGE);
org.springframework.messaging.MessageHandlingException: error occurred in message handler [org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder$ProducerConfigurationMessageHandler@1dc8da99]; nested exception is org.apache.kafka.common.errors.SerializationException: Error registering Avro schema:
I could not figure out why this exception was being raised.
I have published messages in AVRO format successfully before, and the configuration and properties above should be sufficient to do so; they work in other applications, just not in mine.
So I added the following properties, after which publishing succeeded:
spring.cloud.stream.kafka.binder.configuration.basic.auth.credentials.source=USER_INFO
spring.cloud.stream.kafka.binder.configuration.basic.auth.user.info=
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.ssl.truststore.password=
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.ssl.truststore.location=
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.ssl.keystore.location=
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.ssl.keystore.password=
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.ssl.key.password=
spring.cloud.stream.kafka.binder.producer-properties.basic.auth.credentials.source=USER_INFO
spring.cloud.stream.kafka.binder.producer-properties.basic.auth.user.info=
spring.cloud.stream.kafka.binder.producer-properties.schema.registry.auto.register.schemas=false
Although I had already defined the certificate information under the properties below (along with other properties), that alone did not work; the properties above had to be added as well:
spring.cloud.stream.kafka.binder.configuration.ssl.keystore.location
spring.cloud.stream.kafka.binder.configuration.ssl.truststore.location
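The likely reason: the ssl.* settings under spring.cloud.stream.kafka.binder.configuration configure the Kafka client's TLS connection to the brokers, while the Schema Registry client embedded in KafkaAvroSerializer only reads keys under its own schema.registry. prefix, stripping the prefix before use. A hedged sketch of that prefix filtering (the method and property values here are illustrative, not Confluent's actual implementation):

```java
import java.util.HashMap;
import java.util.Map;

public class RegistryPrefixSketch {

    static final String PREFIX = "schema.registry.";

    // Keep only schema.registry.* entries and strip the prefix, mimicking how
    // the serializer separates registry config from broker-level config.
    public static Map<String, String> registryConfig(Map<String, String> producerProps) {
        Map<String, String> out = new HashMap<>();
        for (Map.Entry<String, String> e : producerProps.entrySet()) {
            if (e.getKey().startsWith(PREFIX)) {
                out.put(e.getKey().substring(PREFIX.length()), e.getValue());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        props.put("ssl.truststore.location", "/broker/truststore.jks");             // broker TLS only
        props.put("schema.registry.ssl.truststore.location", "/sr/truststore.jks"); // registry TLS
        // Only the prefixed entry reaches the registry client, now unprefixed
        System.out.println(registryConfig(props).get("ssl.truststore.location"));
    }
}
```

This is why defining the truststore only at the binder configuration level leaves the registry client without TLS material: the two namespaces never overlap.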