
How to change the Kafka client logging levels/preferences?

I am using a plain Java project (no framework) to run a Kafka producer and a consumer.

I am trying to control the logs generated by the KafkaProducer and KafkaConsumer code, but I cannot influence them using the log4j.properties configuration:

log4j.rootLogger=ERROR,stdout

log4j.logger.kafka=ERROR,stdout
log4j.logger.org.apache.kafka.clients.producer.ProducerConfig=ERROR,stdout
log4j.logger.org.apache.kafka.common.utils.AppInfoParser=ERROR,stdout
log4j.logger.org.apache.kafka.clients.consumer.internals.AbstractCoordinator=ERROR,stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n

Still, whatever settings I provide in the log4j.properties file, I get log output like the one below:

[main] INFO org.apache.kafka.clients.producer.ProducerConfig - ProducerConfig values:
...
[main] INFO org.apache.kafka.clients.producer.ProducerConfig - ProducerConfig values:
...
[main] INFO org.apache.kafka.clients.producer.ProducerConfig - ProducerConfig values:
...
[main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator - [Consumer clientId=UM00160, groupId=string-group] (Re-)joining group

How can I control the logging of the Kafka clients library? What am I missing to link my log4j.properties file to the Kafka clients library logging? In order not to spam the output, I currently have to run the Maven tests using mvn test 2> /dev/null. Can I configure this via log4j.properties?

Context:

I have the following relevant files:

── test
   ├── java
   │   └── com
   │       └── example
   │           ├── PropertyReader.java
   │           └── strings
   │               └── TestKafkaStringValues.java
   └── resources
       ├── application.properties
       └── log4j.properties

And I am trying to run TestKafkaStringValues.java both using the Maven surefire plugin (mvn test) and the Eclipse JUnit launcher (equivalent to plain java ...).

For surefire I use the following configuration in the Maven pom.xml:

<plugin>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
    <configuration>
        <systemPropertyVariables>
            <log4j.configuration>file:log4j.properties</log4j.configuration>
        </systemPropertyVariables>
    </configuration>
</plugin>

and for JUnit I use the following Java VM argument: -Dlog4j.configuration=log4j.properties.

I also tried, in both cases, using the absolute path to log4j.properties. Still not working.

You can see the complete code here.

The problem in the code above was a missing Maven runtime dependency: the actual Log4j logging implementation. In the pom, the slf4j-simple logging implementation was provided instead. This implementation was:

  • able to print the Kafka logs to stdout
  • NOT able to understand the log4j.properties file or the -Dlog4j.* system properties.

Hence, one has to include a Log4j implementation. Here one has the choice between Log4j 1.x (end of life) and Log4j2.

With the following configuration, one should be able to have very comprehensive/granular control over the logging (including the Kafka clients).

In the pom.xml:

<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>2.13.1</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.13.1</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-slf4j-impl</artifactId>
    <version>2.13.1</version>
    <scope>test</scope>
</dependency>

log4j-api and log4j-core are the minimum requirements. In order for Log4j2 to also be able to control/configure libraries/components written on top of SLF4J (and the Kafka client is such a library), you need to add the third dependency: log4j-slf4j-impl.

NOTE: for libraries that use SLF4J 1.8.x and higher, you will need another version of this Log4j-SLF4J adapter. See this for more information.
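For the SLF4J 1.8.x+ case, Log4j2 ships a separate adapter artifact (log4j-slf4j18-impl instead of log4j-slf4j-impl). As a hedged sketch, the dependency would look roughly like this, with the version shown only as an example:

```xml
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-slf4j18-impl</artifactId>
    <version>2.13.1</version>
    <scope>test</scope>
</dependency>
```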

Now, regarding configuring the logging: Log4j2 automatically loads configuration files if it finds them, searching in multiple locations.

If you place the following log4j2.properties file on the resource classpath (in src/main/resources for main code and in src/test/resources for test code), you will get the desired outcome:

rootLogger.level = info
rootLogger.appenderRefs = stdout
rootLogger.appenderRef.stdout.ref = STDOUT

appenders = stdout

appender.stdout.name = STDOUT
appender.stdout.type = Console
appender.stdout.layout.type = PatternLayout
appender.stdout.layout.pattern =%d{yyyy-MM-dd HH:mm:ss.SSS} [%level] [%t] %c - %m%n

loggers = kafka, kafka-consumer

logger.kafka.name = org.apache.kafka
logger.kafka.level = warn

logger.kafka-consumer.name = org.apache.kafka.clients.consumer
logger.kafka-consumer.level = info

In the above example, all logging is written to stdout and:

  • the root logger logs info and above
  • all org.apache.kafka-prefixed loggers log warn and above
  • all org.apache.kafka.clients.consumer-prefixed loggers log info and above
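Building on the same pattern, one could silence just the chatty coordinator messages from the question while keeping the rest of the consumer logging at info. This is a sketch under the same configuration style; the logger key kafka-coordinator is an arbitrary name chosen here:

```properties
loggers = kafka, kafka-consumer, kafka-coordinator

# only error and above from the (Re-)joining group messages
logger.kafka-coordinator.name = org.apache.kafka.clients.consumer.internals.AbstractCoordinator
logger.kafka-coordinator.level = error
```

More specific logger names win over less specific prefixes, so this entry overrides the org.apache.kafka.clients.consumer setting for that one class.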

Here are some extra observations when using Log4j2:

  • if you want JSON or YAML configuration, you need extra dependencies
  • the JUnit plugin in Eclipse will silently terminate without any output if the Log4j configuration is not correct. The mvn output will show you the error, though.
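One way to diagnose such silent failures is Log4j2's internal status logging, which reports how (and from which location) the configuration was loaded. It can be enabled either via the log4j2.debug system property or, as sketched below, with a status line at the top of log4j2.properties:

```properties
# prints Log4j2's own initialization/diagnostic messages to the console,
# e.g. which configuration file was found and any parsing errors
status = trace
```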
