
Too many open files on a project using spring-cloud stream and kafka after upgrade to 2.1

After upgrading to Spring Cloud Stream 2.1 on a project using a multi-binder setup with Kafka (2 brokers) and Rabbit (1 broker), we are facing a "too many open files" problem.

The number of open files keeps growing until it reaches the limit defined by the OS (Red Hat 7.3).

I use the actuator endpoint to monitor it: /actuator/metrics/process.files.open

{
    "name": "process.files.open",
    "description": "The open file descriptor count",
    "baseUnit": "files",
    "measurements": [
        {
            "statistic": "VALUE",
            "value": 1686
        }
    ]
}
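The same counter can also be read directly from the JVM, without going through Actuator, via the `com.sun.management.UnixOperatingSystemMXBean` extension (available on Linux/Unix JVMs; this is what Micrometer's `process.files.open` metric is based on). A minimal sketch for spot-checking the descriptor count:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import com.sun.management.UnixOperatingSystemMXBean;

public class OpenFdCheck {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        // The Unix-specific subinterface exposes file-descriptor counters
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
            System.out.println("open fds: " + unix.getOpenFileDescriptorCount());
            System.out.println("max fds:  " + unix.getMaxFileDescriptorCount());
        } else {
            System.out.println("FD counters not available on this platform");
        }
    }
}
```

Logging this periodically (or plotting the Actuator metric) makes it easy to see whether the count grows linearly with time, which usually points at a resource being opened per poll/connection and never closed.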

Using version 2.0 of Spring Cloud Stream the problem is not observed; the number of open files is stable at around 80. I'm using exactly the following versions:

 +- org.springframework.cloud:spring-cloud-stream-binder-kafka-core:jar:2.0.0.RELEASE:compile
[INFO] |  |  \- org.springframework.integration:spring-integration-kafka:jar:3.0.3.RELEASE:compile
[INFO] |  +- org.apache.kafka:kafka-clients:jar:1.0.2:compile
[INFO] |  |  +- org.lz4:lz4-java:jar:1.4:compile
[INFO] |  |  \- org.xerial.snappy:snappy-java:jar:1.1.4:compile
[INFO] |  \- org.springframework.kafka:spring-kafka:jar:2.1.10.RELEASE:compile

I suspect that the upgrade to kafka-clients 2.0.0 is the potential cause.

As a first attempt I wanted to try Spring Cloud Stream 2.1 with kafka-clients 1.0.2, which according to the docs should be possible, but I'm facing an issue. Here's my Maven configuration with the exclusions:

<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-kafka</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-kafka</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>1.0.2</version>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.2.5.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-kafka</artifactId>
    <version>3.1.0.RELEASE</version>
</dependency>

I'm getting the following error:

java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.Consumer.poll(Ljava/time/Duration;)Lorg/apache/kafka/clients/consumer/ConsumerRecords;
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:741)
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:698)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:264)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java)
    at java.base/java.lang.Thread.run(Thread.java:834)

Any idea about the "too many open files" problem and how to diagnose it? How can I test with downgraded Kafka clients?
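One Linux-specific way to diagnose what the descriptors actually point to (files, sockets, pipes) is to list `/proc/self/fd` from inside the JVM; each entry is a symlink to the open resource. A rough sketch (assumes a Linux host, as in the Red Hat setup above):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FdDump {
    public static void main(String[] args) throws IOException {
        // Each entry in /proc/self/fd is a symlink describing one open descriptor
        try (DirectoryStream<Path> fds = Files.newDirectoryStream(Paths.get("/proc/self/fd"))) {
            for (Path fd : fds) {
                try {
                    System.out.println(fd.getFileName() + " -> " + Files.readSymbolicLink(fd));
                } catch (IOException e) {
                    // A descriptor can be closed between listing and readlink; skip it
                }
            }
        }
    }
}
```

If most entries are `socket:[...]` links, the leak is in network connections (e.g. Kafka/Rabbit clients) rather than regular files; `lsof -p <pid>` gives the same view from outside the process.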

Thanks for your help.

java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.Consumer.poll(Ljava/time/Duration;)Lorg/apache/kafka/clients/consumer/ConsumerRecords;

spring-kafka 2.2.x needs kafka-clients 2.0.1 or later. The `poll(Duration)` overload it calls does not exist in kafka-clients 1.0.2, hence the `NoSuchMethodError`.
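Given that constraint, the downgrade test from the question cannot use kafka-clients 1.0.2 together with spring-kafka 2.2.5.RELEASE. The lowest client version that override could pin instead is 2.0.1 (keeping the rest of the exclusions as in the question):

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.0.1</version>
</dependency>
```

Running `mvn dependency:tree` afterwards is a quick way to confirm that no other module still pulls in a conflicting kafka-clients version.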

