Too many open files on a project using spring-cloud stream and kafka after upgrade to 2.1
After upgrading to Spring Cloud Stream 2.1 on a project that uses a multi-binder setup with Kafka (2 brokers) and Rabbit (1 broker), we are facing a too-many-open-files problem.
The number of open files keeps growing until it reaches the limit defined by the OS (Red Hat 7.3).
I monitor it with the actuator endpoint /actuator/metrics/process.files.open:
{
  "name": "process.files.open",
  "description": "The open file descriptor count",
  "baseUnit": "files",
  "measurements": [
    {
      "statistic": "VALUE",
      "value": 1686
    }
  ]
}
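The actuator metric only reports a total. To see which kind of descriptor is actually accumulating (sockets to the brokers, regular files, pipes), you can inspect /proc on the Red Hat host directly. A minimal sketch, assuming Linux and a single `java` process (the `pgrep` pattern is an assumption; adjust it to match your service):

```shell
#!/bin/sh
# Find the JVM's pid; fall back to this shell so the demo still runs
PID=$(pgrep -n java || echo $$)

# Total open descriptors: the same number the actuator metric reports
echo "open fds for pid $PID: $(ls /proc/"$PID"/fd | wc -l)"

# Group descriptors by target type (socket, pipe, regular file)
# to see which kind is leaking
ls -l /proc/"$PID"/fd | awk 'NR>1 {print $NF}' \
  | sed -e 's/socket:.*/socket/' -e 's/pipe:.*/pipe/' -e 's#^/.*#file#' \
  | sort | uniq -c | sort -rn
```

`lsof -p <pid>` gives a similar breakdown with more detail (e.g. which remote endpoints the sockets point at), which helps tell broker connections apart from log files.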
This problem was not observed with Spring Cloud Stream 2.0, where the number of open files was stable at around 80, using these exact versions:
[INFO] +- org.springframework.cloud:spring-cloud-stream-binder-kafka-core:jar:2.0.0.RELEASE:compile
[INFO] | | \- org.springframework.integration:spring-integration-kafka:jar:3.0.3.RELEASE:compile
[INFO] | +- org.apache.kafka:kafka-clients:jar:1.0.2:compile
[INFO] | | +- org.lz4:lz4-java:jar:1.4:compile
[INFO] | | \- org.xerial.snappy:snappy-java:jar:1.1.4:compile
[INFO] | \- org.springframework.kafka:spring-kafka:jar:2.1.10.RELEASE:compile
I suspect the upgrade to kafka-clients 2.0.0 is the underlying cause.
Following the documentation, I first tried to run Spring Cloud Stream 2.1 with kafka-clients 1.0.2, which is supposed to be possible, but I ran into a problem. Here is my Maven exclusion configuration:
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-kafka</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-kafka</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.0.2</version>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.2.5.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-kafka</artifactId>
    <version>3.1.0.RELEASE</version>
</dependency>
I get the following error:
java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.Consumer.poll(Ljava/time/Duration;)Lorg/apache/kafka/clients/consumer/ConsumerRecords;
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:741)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:698)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:264)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java)
at java.base/java.lang.Thread.run(Thread.java:834)
Any ideas on what causes the too many open files issue and how to diagnose it? And how can I test with the downgraded kafka-clients?
Thanks for your help.
java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.Consumer.poll(Ljava/time/Duration;)Lorg/apache/kafka/clients/consumer/ConsumerRecords;
spring-kafka 2.2.x requires kafka-clients 2.0.1 or higher. The Consumer.poll(Duration) overload it calls was only added in kafka-clients 2.0.0, which is why pairing spring-kafka 2.2.5 with kafka-clients 1.0.2 fails with this NoSuchMethodError.
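Given that constraint, one possible fix sketch for testing with the downgraded client: keep the exclusions from the question, but pin spring-kafka and spring-integration-kafka back to the versions from the working Spring Cloud Stream 2.0 tree shown above (2.1.10.RELEASE / 3.0.3.RELEASE), since those were built against kafka-clients 1.0.x. These version pairings are taken from the question's own dependency tree; whether the 2.1 binder itself works with them is an assumption you would still need to verify.

```xml
<!-- Assumed combination (untested with the 2.1 binder): spring-kafka
     2.1.x and spring-integration-kafka 3.0.x match kafka-clients 1.0.x -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.0.2</version>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.1.10.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-kafka</artifactId>
    <version>3.0.3.RELEASE</version>
</dependency>
```

Running mvn dependency:tree afterwards confirms which versions Maven actually resolves onto the classpath.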