
kafka streams exception: Could not find a public no-argument constructor for org.apache.kafka.common.serialization.Serdes$WrapperSerde

I am getting the below error stack trace while working with Kafka Streams.

UPDATE: as per @matthias-j-sax, I have implemented my own Serdes with a default constructor for WrapperSerde, but I am still getting the following exceptions:

org.apache.kafka.streams.errors.StreamsException: stream-thread [streams-request-count-4c239508-6abe-4901-bd56-d53987494770-StreamThread-1] Failed to rebalance.
    at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests (StreamThread.java:836)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce (StreamThread.java:784)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop (StreamThread.java:750)
    at org.apache.kafka.streams.processor.internals.StreamThread.run (StreamThread.java:720)
Caused by: org.apache.kafka.streams.errors.StreamsException: Failed to configure value serde class myapps.serializer.Serdes$WrapperSerde
    at org.apache.kafka.streams.StreamsConfig.defaultValueSerde (StreamsConfig.java:972)
    at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.<init> (AbstractProcessorContext.java:59)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.<init> (ProcessorContextImpl.java:42)
    at org.apache.kafka.streams.processor.internals.StreamTask.<init> (StreamTask.java:136)
    at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:405)
    at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:369)
    at org.apache.kafka.streams.processor.internals.StreamThread$AbstractTaskCreator.createTasks (StreamThread.java:354)
    at org.apache.kafka.streams.processor.internals.TaskManager.addStreamTasks (TaskManager.java:148)
    at org.apache.kafka.streams.processor.internals.TaskManager.createTasks (TaskManager.java:107)
    at org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener.onPartitionsAssigned (StreamThread.java:260)
    at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.onJoinComplete (ConsumerCoordinator.java:259)
    at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.joinGroupIfNeeded (AbstractCoordinator.java:367)
    at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureActiveGroup (AbstractCoordinator.java:316)
    at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll (ConsumerCoordinator.java:290)
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce (KafkaConsumer.java:1149)
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll (KafkaConsumer.java:1115)
    at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests (StreamThread.java:827)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce (StreamThread.java:784)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop (StreamThread.java:750)
    at org.apache.kafka.streams.processor.internals.StreamThread.run (StreamThread.java:720)
Caused by: java.lang.NullPointerException
    at myapps.serializer.Serdes$WrapperSerde.configure (Serdes.java:30)
    at org.apache.kafka.streams.StreamsConfig.defaultValueSerde (StreamsConfig.java:968)
    at org.apache.kafka.streams.processor.internals.AbstractProcessorContext.<init> (AbstractProcessorContext.java:59)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.<init> (ProcessorContextImpl.java:42)
    at org.apache.kafka.streams.processor.internals.StreamTask.<init> (StreamTask.java:136)
    at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:405)
    at org.apache.kafka.streams.processor.internals.StreamThread$TaskCreator.createTask (StreamThread.java:369)
    at org.apache.kafka.streams.processor.internals.StreamThread$AbstractTaskCreator.createTasks (StreamThread.java:354)
    at org.apache.kafka.streams.processor.internals.TaskManager.addStreamTasks (TaskManager.java:148)
    at org.apache.kafka.streams.processor.internals.TaskManager.createTasks (TaskManager.java:107)
    at org.apache.kafka.streams.processor.internals.StreamThread$RebalanceListener.onPartitionsAssigned (StreamThread.java:260)
    at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.onJoinComplete (ConsumerCoordinator.java:259)
    at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.joinGroupIfNeeded (AbstractCoordinator.java:367)
    at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureActiveGroup (AbstractCoordinator.java:316)
    at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll (ConsumerCoordinator.java:290)
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce (KafkaConsumer.java:1149)
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll (KafkaConsumer.java:1115)
    at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests (StreamThread.java:827)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce (StreamThread.java:784)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop (StreamThread.java:750)
    at org.apache.kafka.streams.processor.internals.StreamThread.run (StreamThread.java:720)

Here's my use case:

I will be getting JSON responses as input to the stream, and I want to count the requests whose status codes are not 200. Initially, I went through the Kafka Streams documentation (both the official docs and Confluent's) and implemented WordCountDemo, which works fine. Then I tried to write this code, but I am getting this exception. I am very new to Kafka Streams; I went through the stack trace but couldn't understand the context, so I came here for help!

Here's my code:

LogCount.java

package myapps;

import java.util.Properties;
import java.util.concurrent.CountDownLatch;

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import myapps.serializer.JsonDeserializer;
import myapps.serializer.JsonSerializer;
import myapps.Request;


public class LogCount {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-request-count");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        JsonSerializer<Request> requestJsonSerializer = new JsonSerializer<>();
        JsonDeserializer<Request> requestJsonDeserializer = new JsonDeserializer<>(Request.class);
        Serde<Request> requestSerde = Serdes.serdeFrom(requestJsonSerializer, requestJsonDeserializer);
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, requestSerde.getClass().getName());
        final StreamsBuilder builder = new StreamsBuilder();

        KStream<String, Request> source = builder.stream("streams-requests-input");
        source.filter((k, v) -> v.getHttpStatusCode() != 200)
                .groupByKey()
                .count()
                .toStream()
                .to("streams-requests-output", Produced.with(Serdes.String(), Serdes.Long()));
        final Topology topology = builder.build();
        final KafkaStreams streams = new KafkaStreams(topology, props);
        final CountDownLatch latch = new CountDownLatch(1);

        System.out.println(topology.describe());
        // attach shutdown handler to catch control-c
        Runtime.getRuntime().addShutdownHook(new Thread("streams-shutdown-hook") {
            @Override
            public void run() {
                streams.close();
                latch.countDown();
            }
        });

        try {
            streams.cleanUp();
            streams.start();
            latch.await();
        } catch (Throwable e) {
            System.exit(1);
        }
        System.exit(0);
    }
}

JsonDeserializer.java

package myapps.serializer;

import com.google.gson.Gson;
import org.apache.kafka.common.serialization.Deserializer;
import java.util.Map;

public class JsonDeserializer<T> implements Deserializer<T> {

    private Gson gson = new Gson();
    private Class<T> deserializedClass;

    public JsonDeserializer(Class<T> deserializedClass) {
        this.deserializedClass = deserializedClass;
    }

    public JsonDeserializer() {
    }

    @Override
    @SuppressWarnings("unchecked")
    public void configure(Map<String, ?> map, boolean b) {
        if(deserializedClass == null) {
            deserializedClass = (Class<T>) map.get("serializedClass");
        }
    }

    @Override
    public T deserialize(String s, byte[] bytes) {
         if(bytes == null){
             return null;
         }

         return gson.fromJson(new String(bytes),deserializedClass);

    }

    @Override
    public void close() {

    }
}

JsonSerializer.java

package myapps.serializer;

import com.google.gson.Gson;
import org.apache.kafka.common.serialization.Serializer;

import java.nio.charset.Charset;
import java.util.Map;

public class JsonSerializer<T> implements Serializer<T> {

    private Gson gson = new Gson();

    @Override
    public void configure(Map<String, ?> map, boolean b) {

    }

    @Override
    public byte[] serialize(String topic, T t) {
        return gson.toJson(t).getBytes(Charset.forName("UTF-8"));
    }

    @Override
    public void close() {

    }
}

As I mentioned, I will be getting JSON as input; the structure is like this:

{
    "RequestID": "1f6b2409",
    "Protocol": "http",
    "Host": "abc.com",
    "Method": "GET",
    "HTTPStatusCode": "200",
    "User-Agent": "curl%2f7.54.0"
}

The corresponding Request.java file looks like this:

package myapps;

public final class Request {
    private String requestID;
    private String protocol;
    private String host;
    private String method;
    private int httpStatusCode;
    private String userAgent;

    public String getRequestID() {
        return requestID;
    }
    public void setRequestID(String requestID) {
        this.requestID = requestID;
    }
    public String getProtocol() {
        return protocol;
    }
    public void setProtocol(String protocol) {
        this.protocol = protocol;
    }
    public String getHost() {
        return host;
    }
    public void setHost(String host) {
        this.host = host;
    }
    public String getMethod() {
        return method;
    }
    public void setMethod(String method) {
        this.method = method;
    }
    public int getHttpStatusCode() {
        return httpStatusCode;
    }
    public void setHttpStatusCode(int httpStatusCode) {
        this.httpStatusCode = httpStatusCode;
    }
    public String getUserAgent() {
        return userAgent;
    }
    public void setUserAgent(String userAgent) {
        this.userAgent = userAgent;
    }
}
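One thing worth double-checking with Gson: its default field mapping is case-sensitive, so JSON keys such as `RequestID` or `HTTPStatusCode` will not populate the `requestID` / `httpStatusCode` fields unless the names are mapped explicitly. A hedged sketch using Gson's `@SerializedName` annotation (assuming Gson is on the classpath, as in the serializers above):

```java
import com.google.gson.annotations.SerializedName;

public final class Request {
    // Map the JSON key names (which differ in case from the field names) onto the Java fields
    @SerializedName("RequestID")
    private String requestID;
    @SerializedName("HTTPStatusCode")
    private int httpStatusCode;
    @SerializedName("User-Agent")
    private String userAgent;
    // ... Protocol, Host, and Method need the same treatment; getters/setters as in the original class
}
```

Without such a mapping, `httpStatusCode` would silently stay at its default of 0, which also affects the `v.getHttpStatusCode() != 200` filter above.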

EDIT: when I exit from kafka-console-consumer.sh, it says Processed a total of 0 messages.
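A side note on the console-consumer check: the output topic of count() carries Long values, so reading it with the console consumer's default string deserializer can make the output look empty or garbled. A hedged sketch of an invocation that decodes the values (topic name taken from the code above; adjust the script path to your installation):

```shell
# Read the counts topic, decoding keys as strings and values as longs
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic streams-requests-output \
    --from-beginning \
    --property print.key=true \
    --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
    --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
```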

As the error indicates, the class configured for Serdes$WrapperSerde is missing a no-argument default constructor:

Could not find a public no-argument constructor 

The issue is this construct:

Serde<Request> requestSerde = Serdes.serdeFrom(requestJsonSerializer, requestJsonDeserializer);
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, requestSerde.getClass().getName());

Serdes.serdeFrom returns a WrapperSerde that does not have an empty default constructor. Thus, you cannot pass it into the StreamsConfig. You can use a Serde generated like this only if you pass the object into the corresponding API calls (i.e., overwrite the default Serde for certain operators).

To make it work (i.e., to be able to set the Serde in the config), you need to implement a proper class that implements the Serde interface.

The requestSerde.getClass().getName() did not work for me. I needed to provide my own WrapperSerde implementation in an inner class. You probably need to do the same with something like:

public class MySerde extends Serdes.WrapperSerde<Request> {
    public MySerde() {
        // The serializer and deserializer must be constructed here,
        // since the class is instantiated via its no-argument constructor
        super(new JsonSerializer<>(), new JsonDeserializer<>(Request.class));
    }
}
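With such a wrapper in place (the class must be public, and static if nested), it can then be registered in the config by name, roughly like:

```java
// Register the no-arg Serde wrapper class instead of the anonymous WrapperSerde
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, MySerde.class.getName());
```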

Instead of specifying the serde in the properties, add the custom serde when creating the stream (this requires importing org.apache.kafka.streams.kstream.Consumed):

 KStream<String, Request> source = builder.stream("streams-requests-input", Consumed.with(Serdes.String(), requestSerde));
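Note that with this approach the default serdes still apply downstream of the filter, so depending on your Kafka Streams version you may also need to pass the serde to the grouping step. A hedged sketch (Grouped.with exists from Kafka 2.1 onward; older versions use Serialized.with instead):

```java
// Supply serdes per operator instead of relying on the configured defaults
source.filter((k, v) -> v.getHttpStatusCode() != 200)
      .groupByKey(Grouped.with(Serdes.String(), requestSerde))
      .count()
      .toStream()
      .to("streams-requests-output", Produced.with(Serdes.String(), Serdes.Long()));
```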
