
Spring Kafka: The class is not in the trusted packages

In my Spring Boot/Kafka application, before a library update I used the class org.telegram.telegrambots.api.objects.Update to publish messages to Kafka. Now I use org.telegram.telegrambots.meta.api.objects.Update. As you can see, they are in different packages.

After restarting the application, I ran into the following problem:

[org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: null

org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition telegram.fenix.bot.update-0 at offset 4223. If needed, please seek past the record to continue consumption.
Caused by: java.lang.IllegalArgumentException: The class 'org.telegram.telegrambots.api.objects.Update' is not in the trusted packages: [java.util, java.lang, org.telegram.telegrambots.meta.api.objects]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.getClassIdType(DefaultJackson2JavaTypeMapper.java:139) ~[spring-kafka-2.1.8.RELEASE.jar!/:2.1.8.RELEASE]
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.toJavaType(DefaultJackson2JavaTypeMapper.java:113) ~[spring-kafka-2.1.8.RELEASE.jar!/:2.1.8.RELEASE]
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:221) ~[spring-kafka-2.1.8.RELEASE.jar!/:2.1.8.RELEASE]
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:967) ~[kafka-clients-1.1.0.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.access$3300(Fetcher.java:93) ~[kafka-clients-1.1.0.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1144) ~[kafka-clients-1.1.0.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1400(Fetcher.java:993) ~[kafka-clients-1.1.0.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:527) ~[kafka-clients-1.1.0.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:488) ~[kafka-clients-1.1.0.jar!/:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1155) ~[kafka-clients-1.1.0.jar!/:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1115) ~[kafka-clients-1.1.0.jar!/:na]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:699) ~[spring-kafka-2.1.8.RELEASE.jar!/:2.1.8.RELEASE]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_171]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_171]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_171]
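The check that fails in this stack trace is essentially a package lookup: the deserializer takes the class name found in the record's type header, extracts its package, and compares it against the trusted list, where "*" trusts everything. A minimal illustrative sketch of that check (not spring-kafka's actual implementation) follows:

```java
import java.util.Arrays;
import java.util.List;

public class TrustedPackages {

    // Illustrative version of the trusted-package check performed by
    // DefaultJackson2JavaTypeMapper.getClassIdType; a sketch only, not
    // spring-kafka's real code.
    static boolean isTrusted(String className, List<String> trustedPackages) {
        if (trustedPackages.contains("*")) {
            return true; // "*" means trust everything
        }
        int lastDot = className.lastIndexOf('.');
        String packageName = lastDot >= 0 ? className.substring(0, lastDot) : "";
        return trustedPackages.contains(packageName);
    }

    public static void main(String[] args) {
        List<String> trusted = Arrays.asList(
                "java.util", "java.lang", "org.telegram.telegrambots.meta.api.objects");
        // The old package from the error message is rejected:
        System.out.println(isTrusted("org.telegram.telegrambots.api.objects.Update", trusted)); // prints false
        // The new package is accepted:
        System.out.println(isTrusted("org.telegram.telegrambots.meta.api.objects.Update", trusted)); // prints true
    }
}
```

This is why the exception lists exactly the packages in the trusted list: old records still carry the old class name in their headers, and its package is not among them.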

Here is my configuration:

@EnableAsync
@Configuration
public class ApplicationConfig {

    @Bean
    public StringJsonMessageConverter jsonConverter() {
        return new StringJsonMessageConverter();
    }

}

@Configuration
public class KafkaProducerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Bean
    public Map<String, Object> producerConfigs() {

        Map<String, Object> props = new HashMap<>();

        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 15000000);

        return props;
    }

    @Bean
    public ProducerFactory<String, Update> updateProducerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, Update> updateKafkaTemplate() {
        return new KafkaTemplate<>(updateProducerFactory());
    }

}

@Configuration
public class KafkaConsumerConfig {

    @Value("${kafka.consumer.max.poll.interval.ms}")
    private String kafkaConsumerMaxPollIntervalMs;

    @Value("${kafka.consumer.max.poll.records}")
    private String kafkaConsumerMaxPollRecords;

    @Value("${kafka.topic.telegram.fenix.bot.update.consumer.concurrency}")
    private Integer updateConsumerConcurrency;

    @Bean
    public ConsumerFactory<String, String> consumerFactory(KafkaProperties kafkaProperties) {
        return new DefaultKafkaConsumerFactory<>(kafkaProperties.buildConsumerProperties(), new StringDeserializer(), new JsonDeserializer<>(String.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(KafkaProperties kafkaProperties) {

        kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, kafkaConsumerMaxPollIntervalMs);
        kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, kafkaConsumerMaxPollRecords);

        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.getContainerProperties().setAckMode(AckMode.MANUAL_IMMEDIATE);
        factory.setConsumerFactory(consumerFactory(kafkaProperties));

        return factory;
    }

    @Bean
    public ConsumerFactory<String, Update> updateConsumerFactory(KafkaProperties kafkaProperties) {
        return new DefaultKafkaConsumerFactory<>(kafkaProperties.buildConsumerProperties(), new StringDeserializer(), new JsonDeserializer<>(Update.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Update> updateKafkaListenerContainerFactory(KafkaProperties kafkaProperties) {

        kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, kafkaConsumerMaxPollIntervalMs);
        kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, kafkaConsumerMaxPollRecords);

        ConcurrentKafkaListenerContainerFactory<String, Update> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.getContainerProperties().setAckMode(AckMode.MANUAL_IMMEDIATE);
        factory.setConsumerFactory(updateConsumerFactory(kafkaProperties));
        factory.setConcurrency(updateConsumerConcurrency);

        return factory;
    }

}

Application properties:

spring.kafka.bootstrap-servers=${kafka.host}:${kafka.port}
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.group-id=postfenix
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer

How can I solve this problem and let Kafka deserialize the old messages into the new class?


UPDATE

Here is my listener:

@Component
public class UpdateConsumer {

    @KafkaListener(topics = "${kafka.topic.update}", containerFactory = "updateKafkaListenerContainerFactory")
    public void onUpdateReceived(ConsumerRecord<String, Update> consumerRecord, Acknowledgment ack) {

        //do some logic here

        ack.acknowledge();
    }

}

See the documentation:

Starting with version 2.1, type information can be conveyed in record headers, allowing the handling of multiple types. In addition, the serializer/deserializer can be configured using Kafka properties.

JsonSerializer.ADD_TYPE_INFO_HEADERS (default true); set to false to disable this feature on the JsonSerializer (sets the addTypeInfo property).

JsonDeserializer.KEY_DEFAULT_TYPE; fallback type for deserialization of keys if no header information is present.

JsonDeserializer.VALUE_DEFAULT_TYPE; fallback type for deserialization of values if no header information is present.

JsonDeserializer.TRUSTED_PACKAGES (default java.util, java.lang); comma-delimited list of package patterns allowed for deserialization; * means deserialize all.

By default, the serializer will add type information to the headers.

See the Boot documentation:

Similarly, you can disable the JsonSerializer's default behavior of sending type information in headers:

 spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
 spring.kafka.producer.properties.spring.json.add.type.headers=false

Or you can add type mappings to the inbound message converter to map the source type to the destination type.
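For the classes in the question (old records carrying the old org.telegram.telegrambots.api.objects.Update class id in their headers), such a mapping combined with a trusted-packages entry might look like the following in application.properties. This is a sketch: the property keys are the standard spring-kafka JsonDeserializer ones, and the mapping pairs the old header class id with the new class.

```properties
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=org.telegram.telegrambots.meta.api.objects
spring.kafka.consumer.properties.spring.json.type.mapping=org.telegram.telegrambots.api.objects.Update:org.telegram.telegrambots.meta.api.objects.Update
```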

EDIT

That said, what version are you using?

There are two key points worth mentioning:

  1. The producer and the consumer are two separate projects.
  2. The message (value) being sent is an object type, not a primitive type.

The problem is that the produced message's object class is not available on the consumer side, because they are two separate projects.

To overcome this problem, follow the steps mentioned below in the Spring Boot producer and consumer applications.

---- Producer application ----

** Producer configuration class **

import com.kafka.producer.models.Container;    
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;
import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaProducerConfig {

@Bean
public ProducerFactory<String, Container> producerFactory(){

    Map<String, Object> config = new HashMap<>();

config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

    return new DefaultKafkaProducerFactory<>(config);
}

@Bean
public KafkaTemplate<String, Container> kafkaTemplate(){
    return new KafkaTemplate<>(producerFactory());
}
}

Note: Container is the custom object to be published to the Kafka topic.


** Producer class **

import com.kafka.producer.models.Container;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;

@Service
public class Producer {

private static final Logger LOGGER = LoggerFactory.getLogger(Producer.class);
private static final String TOPIC = "final-topic";

@Autowired
private KafkaTemplate<String, Container> kafkaTemplate;

public void sendUserMessage(Container msg) {
    LOGGER.info(String.format("\n ===== Producing message in JSON ===== \n"+msg));
    Message<Container> message = MessageBuilder
            .withPayload(msg)
            .setHeader(KafkaHeaders.TOPIC, TOPIC)
            .build();
    this.kafkaTemplate.send(message);
}
}

** Producer controller **

import com.kafka.producer.models.Container;
import com.kafka.producer.services.Producer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/message")
public class MessageController {

@Autowired
private Producer producer;

@PostMapping(value = "/publish")
public String sendMessageToKafkaTopic(@RequestBody Container containerMsg) {
    this.producer.sendUserMessage(containerMsg);
    return "Successfully Published !!";
}
}

Note: a message of type Container will be published as a JSON message to the Kafka topic named final-topic.

================================================================

-- Consumer application --

** Configuration class **

import com.kafka.consumer.models.Container;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;
import java.util.HashMap;
import java.util.Map;

@Configuration
@EnableKafka
public class KafkaConsumerOneConfig {

@Bean
public ConsumerFactory<String, Container> consumerFactory(){
    JsonDeserializer<Container> deserializer = new JsonDeserializer<>(Container.class);
    deserializer.setRemoveTypeHeaders(false);
    deserializer.addTrustedPackages("*");
    deserializer.setUseTypeMapperForKey(true);

    Map<String, Object> config = new HashMap<>();

    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "group_one");
    config.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    config.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, deserializer);

    return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(), deserializer);
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, Container> kafkaListenerContainerFactory(){
    ConcurrentKafkaListenerContainerFactory<String, Container> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;

}
}

注意:在這里您可以看到,我們必須使用自定義 JsonDeserializer 來使用來自 final-topic(主題名稱)的容器對象類型 Json 消息,而不是使用默認的 JsonDeserializer()。


** Consumer service **

import com.kafka.consumer.models.Container;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.handler.annotation.Headers;
import org.springframework.messaging.handler.annotation.Payload;
import org.springframework.stereotype.Service;
import java.io.IOException;

@Service
public class ConsumerOne {

private final Logger LOGGER = LoggerFactory.getLogger(ConsumerOne.class);

@KafkaListener(topics = "final-topic", groupId = "group_one", containerFactory = "kafkaListenerContainerFactory")
public void consumeUserMessage(@Payload Container msg, @Headers MessageHeaders headers) throws IOException {
    System.out.println("received data in Consumer One ="+ msg.getMessageTypes());
}
}

For this, there are two ways to do it: either in your deserializer or in your application.yml.

In your deserializer

This applies to the deserializer you use in your DefaultKafkaConsumerFactory (to create your consumer factory). Suppose you want to make a ConsumerFactory<String, Foo>, where Foo is the model/POJO you want to use for your Kafka messages.

You need to call addTrustedPackages on the JsonDeserializer. I have an example in Kotlin, but the syntax is almost the same in Java:

 val deserializer = JsonDeserializer<Foo>()
 deserializer.addTrustedPackages("com.example.entity.Foo") // Adding Foo to our trusted packages

 val consumerFactory = DefaultKafkaConsumerFactory(
      consumerConfigs(),  // your consumer config
      StringDeserializer(), 
      deserializer // Using our newly created deserializer
 )

Or in your application.yml

In your application.yml file, follow the spring-kafka instructions. Here we add the Foo class (com.example.entity.Foo) to the trusted packages, using:

spring:
  kafka:
    consumer:
      properties:
        spring.json.trusted.packages: "com.example.entity.Foo"

spring.json.trusted.packages accepts an array of packages. You can specify a single class's package, or use * to trust any package. In that case you do not need to pass the deserializer into DefaultKafkaConsumerFactory(); setting it in the consumer configuration is enough.

jsonDeserializer.addTrustedPackages("*");

solved my problem with spring-kafka 2.2.8.


Add this to application.properties:

spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*

Important note:

If you supply Serializer and Deserializer instances directly to the KafkaProducer and KafkaConsumer, these properties will not take effect.

References:
[1] https://docs.spring.io/spring-kafka/reference/html/#json-serde
[2] https://github.com/spring-projects/spring-kafka/issues/535

I also faced this problem, but the solutions above did not work for me. The trick, however, was to configure the Kafka consumer factory as follows:

props.put(JsonDeserializer.TRUSTED_PACKAGES, "your.package.name");

For those who run into this problem when using Kafka Streams: you must specify the trusted packages for both the consumer and the streams sections, because Kafka initializes two different deserializers.

spring:
  kafka:
    bootstrap-servers: localhost:9092
  producer:
    value-serializer: org.springframework.kafka.support.serializer.JsonSerializer

  consumer:
...
    properties:
       spring.json.trusted.packages: "YOUR ENTITIES PACKAGE"
  streams:
...
    properties:
      spring.json.trusted.packages: "YOUR ENTITIES PACKAGE"

I ran into a similar problem when I wanted to consume messages in the consumer application; I got two errors:

1- The class 'someClass' is not in the trusted packages: [java.util, java.lang]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*)

2- org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition partion at offset 9902. If needed, please seek past the record to continue consumption.

I solved this by adding these properties to the Kafka consumer configuration builder method (makeConfig) of my KafkaConfig class; once this configuration was in use, my problem was solved:

private Map<String, Object> makeConfig(ServiceMessagePriority input)
{
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    configProps.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, ErrorHandlingDeserializer.class);
    configProps.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, ErrorHandlingDeserializer.class);
    configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    configProps.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.core.model.ServiceMsgDTO");
    configProps.put(JsonDeserializer.USE_TYPE_INFO_HEADERS,false);
    configProps.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
    return configProps;
}

My spring-kafka version is 2.2.11, and I had this error too.

I got the error because I had configured two consumers on the same Kafka topic with different configurations: one had a ConsumerFactory<String, OrderDTO> and the other a ConsumerFactory<String, String>.

I solved the error by fixing the configuration of the consumer that was wrong.

Just check the consumers of the topic.

There are two ways to solve the problem:

  1. Disable type headers in the producer
  2. Add trusted packages in the consumer

Disable type headers in the producer

Your producer properties file should look like this:

spring:
  profiles: dev
  kafka:
    producer:
      bootstrap-servers: localhost:9092, localhost:9093
      key-serializer: org.apache.kafka.common.serialization.IntegerSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      properties:
        acks: 1
        spring:
          json:
            add:
              type:
                headers: false

Your consumer properties file should look like this:

spring:
  profiles: dev
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      group-id: consumer-group-1
      properties:
        spring:
          json:
            value:
              default:
                type: 'com.kafka.consumer.Message'

Add trusted packages in the consumer

If you do not want to change anything in the producer properties file, update your consumer properties file as follows:

spring:
  profiles: dev
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      group-id: consumer-group-1
      properties:
        spring:
          json:
            value:
              default:
                type: 'com.kafka.consumer.Message'
            type:
              mapping: 'com.kafka.producer.Message:com.kafka.consumer.Message'
            trusted:
              packages: 'com.kafka.producer'
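The spring.json.type.mapping value used above is a comma-delimited list of sourceType:targetType pairs, so one producer-side class id can be redirected to a consumer-side class. Purely as an illustration of the format (spring-kafka does its own parsing), a pair list breaks down like this:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TypeMappingFormat {

    // Illustrative parser for the "sourceType:targetType,..." shape of
    // spring.json.type.mapping; a sketch of the format only, not
    // spring-kafka's actual implementation.
    static Map<String, String> parse(String value) {
        Map<String, String> mappings = new LinkedHashMap<>();
        for (String pair : value.split(",")) {
            // Split only on the first ':' so class names stay intact.
            String[] parts = pair.trim().split(":", 2);
            mappings.put(parts[0].trim(), parts[1].trim());
        }
        return mappings;
    }

    public static void main(String[] args) {
        Map<String, String> m =
                parse("com.kafka.producer.Message:com.kafka.consumer.Message");
        // The producer-side class id maps to the consumer-side class:
        System.out.println(m.get("com.kafka.producer.Message")); // prints com.kafka.consumer.Message
    }
}
```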

In my Message class, make sure you have setters and getters as well as a NoArgsConstructor, otherwise it will not work.

@AllArgsConstructor
@Setter
@Getter
@NoArgsConstructor
public class Message {

    Integer id;

    String name;

    String address;

    String phone;

    boolean isActive;
}

Consumer listener:

@Component
@Slf4j
public class KafkaMessageListener {

    @KafkaListener(topics = {"test_json"})
    public void OnMessage(Message rc){
        log.info(rc.getName());
    }
}

Note: my producer and consumer are two different projects, with package names as follows:

Producer: com.kafka.producer.Message

Consumer: com.kafka.consumer.Message

I had the same problem, and it turned out I had too much indentation in my application.yml file. I am completely new to Spring Boot.

Before:

  port: 5000

spring:
  data:
    mongodb:
      host: localhost
      port: 27017
      database: bankaccount
    kafka:
      producer:
        bootstrap-servers: localhost:9092
        key-serializer: org.apache.kafka.common.serialization.StringSerializer
        value-serializer: org.springframework.kafka.support.serializer.JsonSerializer

After:

  port: 5000

spring:
  data:
    mongodb:
      host: localhost
      port: 27017
      database: bankaccount
  kafka:
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer

Using the same package name for the consumer and the producer solved my error.
