
Kafka Consumer throwing "OutOfMemoryError: Java heap space" Error on SSL

I am using Spring-Kafka 2.7.1 in a Spring Boot project.

When I connect it to a Kafka broker configured with SSL, it throws the "OutOfMemory" error shown below, and increasing the heap size several times has not helped.

The log is as follows:

java.lang.OutOfMemoryError: Java heap space
    at java.base/java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:61) ~[na:na]
    at java.base/java.nio.ByteBuffer.allocate(ByteBuffer.java:348) ~[na:na]
    at org.apache.kafka.common.memory.MemoryPool$1.tryAllocate(MemoryPool.java:30) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:113) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:447) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:397) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:674) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:576) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.common.network.Selector.poll(Selector.java:481) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:563) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:265) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:236) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:215) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureCoordinatorReady(AbstractCoordinator.java:245) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.poll(ConsumerCoordinator.java:480) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.updateAssignmentMetadataIfNeeded(KafkaConsumer.java:1257) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1226) ~[kafka-clients-2.7.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1206) ~[kafka-clients-2.7.1.jar!/:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1414) ~[spring-kafka-2.7.7.jar!/:2.7.7]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1251) ~[spring-kafka-2.7.7.jar!/:2.7.7]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1163) ~[spring-kafka-2.7.7.jar!/:2.7.7]
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
    at java.base/java.lang.Thread.run(Thread.java:829) ~[na:na]

My current YAML configuration is as follows:

spring:
  kafka:
    bootstrap-servers: KAFKA_BOOTSTRAP_SERVER
    security:
      protocol: "SSL"
  consumer:
    auto-offset-reset: earliest
producer:
  topic: TOPIC
  bootstrap-servers: KAFKA_BOOTSTRAP_SERVER
consumer:
  topic: TOPIC
  bootstrap-servers: KAFKA_BOOTSTRAP_SERVERS

When connecting to a non-SSL Kafka broker, it works as expected.

I have tested all the other possibilities and narrowed the issue down to the SSL configuration on the client side.

An out-of-memory error can be encountered when a secured Kafka endpoint is accessed in a non-secure way. (This is a known issue when the wrong security protocol is used or the required authentication properties are not passed; the OOM error is completely unrelated to the real cause, but that is how it surfaces.) Typically the client reads the first bytes of the broker's TLS response as a four-byte message-size prefix, decodes a huge value, and then tries to allocate a receive buffer of that size, which is exactly the allocation that fails in the stack trace above.
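
To illustrate why it surfaces this way, below is a minimal Java sketch of that kind of misconfiguration (not taken from the question): a consumer left on the default PLAINTEXT protocol but pointed at what is assumed to be an SSL listener. The broker address, port, topic, and group id are placeholders.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PlaintextAgainstSslListener {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Port 9093 is assumed to be an SSL listener; no security.protocol is set,
        // so the client defaults to PLAINTEXT.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9093");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("TOPIC"));
            // The plaintext client may read the broker's TLS bytes as a message-size
            // prefix and attempt a huge allocation, which then surfaces as
            // "OutOfMemoryError: Java heap space" rather than as an SSL error.
            consumer.poll(Duration.ofSeconds(5));
        }
    }
}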

With the Kafka CLI commands, a properties file path is usually passed along with the command to provide the security-related properties.

For example:

kafka-topics --command-config <String: filename>
kafka-console-producer --producer.config <String: filename>
kafka-console-consumer --consumer.config <String: filename>

That file generally contains:

security.protocol=<kafka_security_protocol>
ssl.truststore.location=<ssl_truststore_filename>
ssl.truststore.password=<truststore_password>
ssl.keystore.location=<client_keystore.jks>
ssl.keystore.password=<password>
ssl.key.password=<password>
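
The same properties can also be fed to a plain Java client. Below is a minimal sketch, assuming the snippet above is saved as client-ssl.properties with the placeholders filled in; the file name, bootstrap server, group id, and topic are illustrative only.

import java.io.FileInputStream;
import java.io.IOException;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SecureConsumerExample {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        // Load security.protocol, ssl.truststore.* and ssl.keystore.* from the
        // properties file described above (path is a placeholder).
        try (FileInputStream in = new FileInputStream("client-ssl.properties")) {
            props.load(in);
        }
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "KAFKA_BOOTSTRAP_SERVER");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("TOPIC"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            records.forEach(r -> System.out.printf("%s-%d: %s%n", r.topic(), r.partition(), r.value()));
        }
    }
}

If a standalone consumer like this can read from the SSL listener, the keystore/truststore material is fine and the problem lies in how the Spring configuration is being applied.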

From the question, I assume that both the producer and consumer components connect to the same broker(s). In that case, declare all the properties required to connect to the secured broker under the spring.kafka section, as in the example below.

spring:
  kafka:
    bootstrap-servers: KAFKA_BOOTSTRAP_SERVER
    security:
      protocol: "SSL"
    ssl:
      trust-store-location: "truststore.jks"
      trust-store-password: "<password>"
      key-store-location: "keystore.jks"
      key-store-password: "<password>"
      key-password: "<password>"
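
In a Spring Boot application, the spring.kafka.ssl.* keys above are translated into the corresponding ssl.* client properties automatically. If you prefer to set (or double-check) them programmatically, a sketch of a consumer factory bean is shown below; it starts from the bound KafkaProperties, and the hard-coded paths and passwords are placeholders.

import java.util.Map;

import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SslConfigs;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class KafkaSslConfig {

    @Bean
    public ConsumerFactory<Object, Object> consumerFactory(KafkaProperties kafkaProperties) {
        // Start from whatever spring.kafka.* (including spring.kafka.ssl.*) was bound to.
        Map<String, Object> configs = kafkaProperties.buildConsumerProperties();

        // The same settings can also be applied here directly; paths and
        // passwords below are placeholders.
        configs.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
        configs.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "truststore.jks");
        configs.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "<password>");
        configs.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "keystore.jks");
        configs.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "<password>");
        configs.put(SslConfigs.SSL_KEY_PASSWORD_CONFIG, "<password>");

        return new DefaultKafkaConsumerFactory<>(configs);
    }
}

With spring-kafka 2.7.x on Spring Boot 2.5.x the YAML alone is sufficient; the bean is only useful if you want to override or trace the resolved values.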

If the producer and consumer connect to different brokers, these properties should be specified under the spring.kafka.producer and spring.kafka.consumer sections respectively.

spring:
  kafka:
    bootstrap-servers: KAFKA_BOOTSTRAP_SERVER
    security:
      protocol: "SSL"
    producer:
      topic: TOPIC
      bootstrap-servers: KAFKA_BOOTSTRAP_SERVER
      ssl.protocol: "SSL"
      ssl.endpoint.identification.algorithm: "https"
      ssl:
        keystore-location: "<keystore.jks>"
        keystore-password: "<password>"
    consumer:
      topic: TOPIC
      auto-offset-reset: "earliest"
      bootstrap-servers: KAFKA_BOOTSTRAP_SERVERS
      ssl.protocol: "SSL"
      ssl.endpoint.identification.algorithm: "https"
      ssl:
        keystore-location: "<keystore.jks>"
        keystore-password: "<password>"

If client authentication is not required on the broker side, then the following is a minimal configuration example:

security.protocol=SSL
ssl.truststore.location=<kafka.client.truststore.jks>
ssl.truststore.password=<password>

If client authentication is required, the following properties also need to be included:

ssl.keystore.location=<kafka.client.keystore.jks>
ssl.keystore.password=<password>
ssl.key.password=<password>

Note that the property naming conventions may differ slightly in the Spring Kafka configuration.

For more details on Kafka security, see the official documentation.
