
Exception found.....org.apache.kafka.common.KafkaException: Failed to construct kafka producer using custom object Serializer

I wrote a spring-kafka package with Spring Boot that sends messages to a Kafka topic, using a String as the key and an ArrayList of custom objects as the value. The "custom object" is a class with item ID, item name, and item ordered count attributes.
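
For reference, the custom object class has roughly the following shape. It is not shown in the original post, so the field names below are only assumptions based on the description above.

package io.springbootlearn.orders.customerorders.models;

// Plain model class carried as the elements of the ArrayList value.
public class UpdateItemCount {

    private Integer itemId;            // item ID (assumed field name)
    private String itemName;           // item name (assumed field name)
    private Integer itemOrderedCount;  // item ordered count (assumed field name)

    public Integer getItemId() { return itemId; }
    public void setItemId(Integer itemId) { this.itemId = itemId; }

    public String getItemName() { return itemName; }
    public void setItemName(String itemName) { this.itemName = itemName; }

    public Integer getItemOrderedCount() { return itemOrderedCount; }
    public void setItemOrderedCount(Integer itemOrderedCount) { this.itemOrderedCount = itemOrderedCount; }
}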

The Kafka producer log looks like this:

2021-10-29 00:09:34.147  INFO 16496 --- [           main] o.a.k.clients.producer.ProducerConfig    : ProducerConfig values: 
    acks = 1
    batch.size = 16384
    bootstrap.servers = [172.26.77.192:9092]
    buffer.memory = 33554432
    client.dns.lookup = use_all_dns_ips
    client.id = producer-1
    compression.type = none
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 120000
    enable.idempotence = false
    interceptor.classes = []
    internal.auto.downgrade.txn.commit = true
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 2147483647
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    socket.connection.setup.timeout.max.ms = 127000
    socket.connection.setup.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
    ssl.endpoint.identification.algorithm = https
    ssl.engine.factory.class = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.certificate.chain = null
    ssl.keystore.key = null
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.3
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.certificates = null
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class io.springbootlearn.orders.customerorders.util.kafkaProducer.KafkaArrayListSerializer

2021-10-29 00:09:34.180  INFO 16496 --- [           main] o.a.k.clients.producer.KafkaProducer     : [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 0 ms.
2021-10-29 00:09:34.180  INFO 16496 --- [           main] org.apache.kafka.common.metrics.Metrics  : Metrics scheduler closed
2021-10-29 00:09:34.180  INFO 16496 --- [           main] org.apache.kafka.common.metrics.Metrics  : Closing reporter org.apache.kafka.common.metrics.JmxReporter
2021-10-29 00:09:34.181  INFO 16496 --- [           main] org.apache.kafka.common.metrics.Metrics  : Metrics reporters closed
2021-10-29 00:09:34.182  INFO 16496 --- [           main] o.a.kafka.common.utils.AppInfoParser     : App info kafka.producer for producer-1 unregistered
2021-10-29 00:09:34.182 DEBUG 16496 --- [           main] o.a.k.clients.producer.KafkaProducer     : [Producer clientId=producer-1] Kafka producer has been closed
ProcessOrders: Exception found.....org.apache.kafka.common.KafkaException: Failed to construct kafka producer

I wrote a custom serializer as shown below.

package io.springbootlearn.orders.customerorders.util.kafkaProducer;

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serializer;

import io.springbootlearn.orders.customerorders.models.UpdateItemCount;

public class KafkaArrayListSerializer<UpdateItemCount> implements Serializer<ArrayList<UpdateItemCount>>{
    
    private final Serializer<UpdateItemCount> arrayListSerializer;
    
    public KafkaArrayListSerializer(final Serializer<UpdateItemCount> paramArrayListSerializer) {
        System.out.println("KafkaArrayListSerializer: Inside constructor.......");
        this.arrayListSerializer = paramArrayListSerializer;
    }

    @Override
    public byte[] serialize(String topic, ArrayList<UpdateItemCount> customerOrderData) {
        System.out.println("KafkaArrayListSerializer: Inside serialize.......");
        System.out.println("Topic name......."+topic);
        
        int dataSize = customerOrderData.size();
        System.out.println("dataSize......."+dataSize);
        
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(baos);
        Iterator<UpdateItemCount> arrayListIter = customerOrderData.iterator();
        try {
            out.writeInt(dataSize);
            while (arrayListIter.hasNext()) {
                final byte[] customerOrderDataToBytes =  arrayListSerializer.serialize(topic, arrayListIter.next());
                out.writeInt(customerOrderDataToBytes.length);
                out.write(customerOrderDataToBytes);
            }
            out.close();
        } catch(IOException e) {
            throw new RuntimeException("unable to serialize ArrayList", e);
        }
        
        return baos.toByteArray();
    }
}

The Serde class for the ArrayList is shown below.

package io.springbootlearn.orders.customerorders.util.kafkaProducer;

import java.util.ArrayList;

import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;

public class ArrayListSerde<T> implements Serde<ArrayList<T>>{
    
    private final Serde<ArrayList<T>> arrayListSerdeObj;
    
    public ArrayListSerde(final Serde<T> SerdeObj) {
        this.arrayListSerdeObj = Serdes.serdeFrom(new KafkaArrayListSerializer<>(SerdeObj.serializer())
                                                 , new KafkaArrayListDeserializer<>(SerdeObj.deserializer()));
    }

    @Override
    public Serializer<ArrayList<T>> serializer() {
        System.out.println("ArrayListSerde: Inside Serializer.......");
        return arrayListSerdeObj.serializer();
    }

    @Override
    public Deserializer<ArrayList<T>> deserializer() {
        return null;
    }
    
    @Override
      public void close() {
        arrayListSerdeObj.serializer().close();
      }

}

The Kafka producer service code looks like this:

package io.springbootlearn.orders.customerorders.util.kafkaProducer;

import java.util.ArrayList;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.logging.log4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.KafkaProducerException;
import org.springframework.kafka.core.KafkaSendCallback;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.ProducerListener;
import org.springframework.kafka.support.SendResult;
import org.springframework.stereotype.Service;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.ListenableFutureCallback;

import io.springbootlearn.orders.customerorders.models.UpdateItemCount;


@Service
@ConfigurationProperties
public class KafkaProducerService {
    
    
    private final KafkaTemplate<String, ArrayList<UpdateItemCount>> kafkaTemplateObj;
    private String KAFKA_TOPIC_NAME = "ItemsOrdered";
    private ListenableFuture<SendResult<String,ArrayList<UpdateItemCount>>> asyncCall;
    private ProducerListener<String, ArrayList<UpdateItemCount>> producerListener;
    
    public KafkaProducerService(KafkaTemplate<String,ArrayList<UpdateItemCount>> kafkaTemplateParam) {
        this.kafkaTemplateObj = kafkaTemplateParam;     
    }
    
    public KafkaTemplate<String, ArrayList<UpdateItemCount>> producerListener(ProducerListener<String, ArrayList<UpdateItemCount>> producerListener) {
        kafkaTemplateObj.setProducerListener(producerListener);
        return kafkaTemplateObj;
    }
    
    public void sendMessage(Integer PartitionId, String key, ArrayList<UpdateItemCount> UpdateItemCountArr) {
        
            try {

                System.out.println("KafkaProducerService:sending message");
                
                final ProducerRecord<String, ArrayList<UpdateItemCount>> record = new ProducerRecord<String, ArrayList<UpdateItemCount>>(KAFKA_TOPIC_NAME, PartitionId, key,UpdateItemCountArr);

                System.out.println("KafkaProducerService: sending async call...");
                
                ListenableFuture<SendResult<String,ArrayList<UpdateItemCount>>> future = kafkaTemplateObj.send(record);
                 
                kafkaTemplateObj.flush();
                
            } catch(KafkaProducerException ex) {
                System.out.println("Exception...." + ex);
                System.out.println("Exception msg...." + ex.getMessage());
                System.out.println("Exception root cause..." + ex.getRootCause());
                System.out.println("Exception stack trace..." + ex.getStackTrace());
            }
     }
    
}

The Kafka configuration is shown below.

spring.kafka.producer.bootstrap-servers = 172.26.77.192:9092
spring.kafka.producer.key-serializer = org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer = io.springbootlearn.orders.customerorders.util.kafkaProducer.KafkaArrayListSerializer
logging.level.org.apache.kafka=debug
logging.level.org.apache.kafka.clients=debug

I have confirmed that my Kafka setup works by writing another program that sends messages to a different Kafka topic with both the key and the value as Strings. A consumer is able to read messages from that topic.

Can someone please help?

Many thanks.

Kafka can only create serializers that have a no-argument constructor. For more complex objects you have to construct them yourself and pass them into the default producer factory (DefaultKafkaProducerFactory) via its constructor or a setter.
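
A minimal sketch of that approach is shown below. It assumes spring-kafka's JsonSerializer as the per-element serializer for UpdateItemCount (the post does not show which element serializer is actually used); the key point is that already-constructed serializer instances are passed to DefaultKafkaProducerFactory instead of naming serializer classes in application.properties.

package io.springbootlearn.orders.customerorders.util.kafkaProducer;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

import io.springbootlearn.orders.customerorders.models.UpdateItemCount;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, ArrayList<UpdateItemCount>> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "172.26.77.192:9092");
        // Hand over serializer *instances*, so KafkaArrayListSerializer does not
        // need the no-argument constructor that class-name configuration requires.
        return new DefaultKafkaProducerFactory<>(
                props,
                new StringSerializer(),
                // JsonSerializer is only a stand-in for whatever serializer the
                // project actually uses for a single UpdateItemCount.
                new KafkaArrayListSerializer<>(new JsonSerializer<UpdateItemCount>()));
    }

    @Bean
    public KafkaTemplate<String, ArrayList<UpdateItemCount>> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

With a factory bean like this, the spring.kafka.producer.*-serializer properties are no longer used to construct the serializers, so the value-serializer class-name entry in application.properties can be dropped.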
