Set Apache Storm and Flink log level to display debug messages
So I'm building a JAR with Storm and Flink applications where I log messages like the following:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
// ...
private static final Logger LOG = LoggerFactory.getLogger(Some.class);
// ...
LOG.debug("...");
LOG.info("...");
LOG.error("...");
Then I pass the JAR to the .../bin/storm
and .../bin/flink
scripts and everything works, but the log level is set to INFO;
I'd like to also display DEBUG
messages from my package only.
I tried several things, but I feel I'm just trying random things from the internet, as I can't find an authoritative reference on how to achieve this, and I'm having a hard time wrapping my head around the incredibly confusing state of the logging facilities for Java...
I'm asking about both Storm and Flink, as I suspect that the root of my problem is the same in both cases, but I might be wrong. I also apologize for not providing a minimal example, but there's really nothing to provide here.
Please let me know if you need additional details.
For Storm, your log configuration is in storm/log4j2/worker.xml.
It's a log4j2 configuration file, so you can find out what options are available by looking at the log4j2 documentation here: https://logging.apache.org/log4j/2.x/manual/configuration.html
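As a sketch, enabling DEBUG for a single package usually means adding a Logger entry to the Loggers section of worker.xml. The package name my.package is a placeholder, and the appender name A1 is assumed to match an appender already defined in your file:

```xml
<Loggers>
    <!-- Placeholder package: raise only this package to DEBUG -->
    <Logger name="my.package" level="debug" additivity="false">
        <!-- "A1" assumed to be the name of an existing appender in worker.xml -->
        <AppenderRef ref="A1"/>
    </Logger>
    <!-- ... keep the existing Root logger and other entries ... -->
</Loggers>
```

Setting additivity="false" keeps messages from being duplicated through the root logger's appenders.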
I'm not as familiar with Flink, but I'd suspect it's similar. Here is Flink's page on it, which mentions that you should have a
logback.xml
file in your conf
directory.
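If logback is in use, a per-package level is typically a one-line logger element in that logback.xml (the package name my.package is a placeholder):

```xml
<!-- Hypothetical fragment for conf/logback.xml: DEBUG for one package only -->
<logger name="my.package" level="DEBUG"/>
```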
In order to modify the log level and which classes log on a Flink cluster, please adapt
FLINK_HOME/conf/log4j.properties
if you are using log4j
FLINK_HOME/conf/logback.xml
if you are using logback
before you start the Flink cluster.
These files will be read when you deploy the Flink cluster. Note that these settings cannot be changed at runtime, unless you replace Flink's
log4j
logger with log4j2,
which supports loading settings dynamically.
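For instance, with the log4j 1.x properties format a per-package level can be set with a single line (the package name my.package is a placeholder):

```properties
# Hypothetical addition to FLINK_HOME/conf/log4j.properties:
# enable DEBUG messages for one package only
log4j.logger.my.package=DEBUG
```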
In this scenario:
Then I pass the JAR to the
.../bin/storm
and .../bin/flink
scripts and everything works, but the log level is set to INFO;
I'd like to also display DEBUG
messages from my package only.
I ended up with the following suboptimal solution.
For unknown reasons, changing the /path/to/storm/log4j2/worker.xml
file has no effect, so I need to act programmatically:
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.config.Configurator;
// ...
// Programmatically enable all log levels for this package only
Configurator.setLevel("my.package", Level.ALL);
For Flink, it's enough to add a line to /path/to/flink/conf/log4j.properties:
log4j.logger.my.package=ALL