
JSON Message Communication with Apache Kafka in Spring Boot

I have implemented two separate applications, one for the producer and one for the consumer, and messages are passed between them through Apache Kafka. When I publish a String message the communication works fine, but when I send a JSON message the following error occurs.

Error

java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer

Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition Kafka_Example_Json-0 at offset 0. If needed, please seek past the record to continue consumption.

Caused by: java.lang.IllegalArgumentException: The class 'com.benz.kafka.api.model.User' is not in the trusted packages: [java.util, java.lang, com.benz.kafka.consumer.api.model, com.benz.kafka.consumer.api.model.*]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).

ConsumerConfig class

@Configuration
@EnableKafka
public class KafkaConfig {

private ConsumerFactory<String, User> userConsumerFactory()
    {
        Map<String,Object> config=new ConcurrentHashMap<>();

        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,"127.0.0.1:9092");
        config.put(ConsumerConfig.GROUP_ID_CONFIG,"group_json");
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        config.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, JsonDeserializer.class);
        config.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS,JsonDeserializer.class.getClass());
        config.put(JsonDeserializer.TRUSTED_PACKAGES,"*");

          return new DefaultKafkaConsumerFactory<>(config,new StringDeserializer(),new JsonDeserializer<>(User.class));
    }

 @Bean
    public ConcurrentKafkaListenerContainerFactory<String,User> userKafkaListenerContainerFactory()
    {
        ConcurrentKafkaListenerContainerFactory<String,User> factory
                =new ConcurrentKafkaListenerContainerFactory<>();

        factory.setConsumerFactory(userConsumerFactory());

        return factory;

    }
}

model

@NoArgsConstructor
@AllArgsConstructor
@Getter
@Setter
public class User {

    private int userId;
    private String userName;
    private double salary;

    @Override
    public String toString() {
        return "User{" +
                "userId=" + userId +
                ", userName='" + userName + '\'' +
                ", salary=" + salary +
                '}';
    }
}
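
For reference, the listener on the consumer side would look roughly like the sketch below (the actual listener code is not part of the question; the topic name Kafka_Example_Json is taken from the error message above and the factory name from the config):

@Service
public class UserListener {

    // Uses the container factory defined in the consumer KafkaConfig above
    @KafkaListener(topics = "Kafka_Example_Json", groupId = "group_json",
            containerFactory = "userKafkaListenerContainerFactory")
    public void consume(User user) {
        System.out.println("Received: " + user);
    }
}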

ProducerConfig class

@Configuration
public class KafkaConfig {

    private ProducerFactory<String,User> producerFactory()
    {
        Map<String,Object> config=new ConcurrentHashMap<>();

          config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,"127.0.0.1:9092");
          config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
          config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

          return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String,User> kafkaTemplate()
    {
        return new KafkaTemplate<>(producerFactory());
    }


}

model

@NoArgsConstructor
@AllArgsConstructor
@Getter
@Setter
public class User {

    private int userId;
    private String userName;
    private double salary;

    @Override
    public String toString() {
        return "User{" +
                "userId=" + userId +
                ", userName='" + userName + '\'' +
                ", salary=" + salary +
                '}';
    }
}
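
For completeness, the message is presumably published with the KafkaTemplate along these lines (a sketch only; the controller, endpoint, and field values are illustrative and not part of the original code):

@RestController
public class UserController {

    @Autowired
    private KafkaTemplate<String, User> kafkaTemplate;

    // Sends the User object; JsonSerializer turns it into a JSON payload
    @GetMapping("/publish")
    public String publish() {
        kafkaTemplate.send("Kafka_Example_Json", new User(1, "benz", 50000.0));
        return "published";
    }
}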

You need to annotate the producerFactory() method with @Bean and add this config:

    // @Bean methods must not be private, so the visibility is changed to public here
    @Bean
    public ProducerFactory<String, User> producerFactory() {
        Map<String, Object> config = new HashMap<>();

        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        // bootstrapServers is a field on the config class (e.g. injected with @Value)
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, AccountEvent> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, AccountEvent> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

If you want to use a project with Kafka infrastructure, I have one on my GitHub: https://github.com/gabryellr/banking-system

This

config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);

should be

config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);

However, it's not clear why your trusted packages config is not being applied; suggest you set a breakpoint in the JsonDeserializer.configure() method.

EDIT

Oh...

return new DefaultKafkaConsumerFactory<>(config,new StringDeserializer(),new JsonDeserializer<>(User.class));

When you pass in a deserializer instance like that, the properties are not used; you have to construct and configure the deserializer fully yourself.

Since you want to use an ErrorHandlingDeserializer anyway, you should pass one of those in here instead.
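
A sketch of that approach, with the JsonDeserializer built and configured in code (the properties in the map are not applied to it) and wrapped in an ErrorHandlingDeserializer:

    // Delegate is configured programmatically, then wrapped so deserialization
    // failures go through the ErrorHandlingDeserializer instead of killing the container
    JsonDeserializer<User> jsonDeserializer = new JsonDeserializer<>(User.class);
    jsonDeserializer.addTrustedPackages("*");

    return new DefaultKafkaConsumerFactory<>(config,
            new StringDeserializer(),
            new ErrorHandlingDeserializer<>(jsonDeserializer));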

Alternatively, change this to

return new DefaultKafkaConsumerFactory<>(config);
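
With that variant the deserializers are created from the map, so the ErrorHandlingDeserializer and JsonDeserializer properties do get applied. The map would then need to look roughly like this (a sketch using the standard spring-kafka property constants):

    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "group_json");
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    // Kafka instantiates the ErrorHandlingDeserializer...
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
    // ...which delegates the real work to JsonDeserializer
    config.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
    config.put(JsonDeserializer.VALUE_DEFAULT_TYPE, User.class);  // class name as a String also works
    config.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
    return new DefaultKafkaConsumerFactory<>(config);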

I have found the cause of this error. As you can see in the error above, the model class lives in two different packages: in the producer it is under com.benz.kafka.api.model, while in the consumer it is under com.benz.kafka.consumer.api.model. That mismatch is the root cause. I changed com.benz.kafka.consumer.api.model to com.benz.kafka.api.model and then it worked.
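
If moving the class is not desirable, spring-kafka also supports type mappings, so each side can keep its own package and map a shared token to its local class (a sketch, not part of the original question; the properties would go into property-configured serializer/deserializer configs):

    // Producer side: map a token to the producer's User class
    config.put(JsonSerializer.TYPE_MAPPINGS, "user:com.benz.kafka.api.model.User");

    // Consumer side: map the same token to the consumer's User class
    config.put(JsonDeserializer.TYPE_MAPPINGS, "user:com.benz.kafka.consumer.api.model.User");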

