
Log4j file rotation doesn't work in spark-yarn

I submit my application to yarn with a custom log4j property file. The custom config itself is recognized and used by both the driver and executor, except for the log rotation part. (The log rotation itself works well without yarn.)

Hadoop: 2.7.0
Spark: 3.1.1
OS: Windows 10

spark-submit:

./bin/spark-submit.cmd --class scala_spark.ScalaApp --master yarn --files "./myapp/log4j-test.properties" --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j-test.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j-test.properties" --deploy-mode cluster ./myapp/myApp.jar

log4j-test.properties:

# Set root logger level to INFO with appenders A1 and my_file.
log4j.rootLogger=INFO, A1, my_file

log4j.logger.scala_spark=DEBUG

# A1 Appender
log4j.appender.A1=org.apache.log4j.ConsoleAppender
log4j.appender.A1.layout=org.apache.log4j.PatternLayout
log4j.appender.A1.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n

# file Appender
log4j.appender.my_file=org.apache.log4j.FileAppender
log4j.appender.my_file.append=true
log4j.appender.my_file.file=./myApp.log
log4j.appender.my_file.threshold=INFO
log4j.appender.my_file.layout=org.apache.log4j.PatternLayout
log4j.appender.my_file.layout.ConversionPattern=%-5p %c: %m%n
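
The title mentions file rotation, but the config above uses a plain FileAppender, which never rolls. A rotating variant (a minimal sketch, assuming log4j 1.x's RollingFileAppender; the size and backup-count values are placeholders) would look like this:

# Rolling file Appender (sketch; MaxFileSize/MaxBackupIndex values are placeholders)
log4j.appender.my_file=org.apache.log4j.RollingFileAppender
log4j.appender.my_file.file=./myApp.log
log4j.appender.my_file.MaxFileSize=10MB
log4j.appender.my_file.MaxBackupIndex=5
log4j.appender.my_file.layout=org.apache.log4j.PatternLayout
log4j.appender.my_file.layout.ConversionPattern=%-5p %c: %m%n

Either appender would presumably hit the same problem under YARN, since (as the solution below shows) the issue is the file location, not the appender type.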

Result:
In $HADOOP_HOME\logs\userlogs\application_1617490627891_0001\container_1617490627891_0001_01_000001 only stderr and stdout can be found, but not myApp.log.

NOTE:
I am 100% sure that log4j-test.properties is in effect, because when I change something like the rootLogger level to TRACE, the extra TRACE and DEBUG logs appear in stdout. If I change to log4j.rootLogger=INFO, my_file, then of course nothing is printed to stdout, but myApp.log is still nowhere to be found.

EDIT:
I thought maybe the app can't create the file for some reason (like a permission issue), but there are no errors at all in the app, Spark, YARN, or HDFS logs.

Solution:
I had to change the log file location to:

log4j.appender.my_file.file=${spark.yarn.app.container.log.dir}/myApp.log

If you need a reference to the proper location to put log files in YARN so that YARN can properly display and aggregate them, use spark.yarn.app.container.log.dir in your log4j.properties.
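
Put together, a minimal sketch of the corrected appender section (assuming the same my_file appender as above; Spark sets spark.yarn.app.container.log.dir as a JVM system property inside each container, and log4j 1.x expands ${...} references from system properties):

# file Appender, writing into the YARN container log directory
log4j.appender.my_file=org.apache.log4j.FileAppender
log4j.appender.my_file.file=${spark.yarn.app.container.log.dir}/myApp.log
log4j.appender.my_file.append=true
log4j.appender.my_file.threshold=INFO
log4j.appender.my_file.layout=org.apache.log4j.PatternLayout
log4j.appender.my_file.layout.ConversionPattern=%-5p %c: %m%n

With this location, myApp.log should show up next to stderr and stdout in the container log directory and be picked up by YARN log aggregation.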

source: https://spark.apache.org/docs/latest/running-on-yarn.html
