Spark Executor logs in local or standalone mode
I am running a spark-submit job in my local environment and, to debug the whole process, I want to see the executor logs. For this I have made the following changes:
1- Editing the log4j property files - two property files, one for the executor and one for the driver:
log4j.rootCategory=DEBUG, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/tmp/executor-application.log
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
2- Adding the log settings to the spark-defaults.conf file:
spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-executor.properties
spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-driver.properties
When I run the spark-submit job locally, I see only the driver logs, not the executor logs:
spark-submit --master "local[*]" --class com.test.action.myjob test_job.jar
Am I missing something? Why can't I see the executor logs? Any pointers will help.

Thanks
Configure logging options: You may need to configure logging options for executors in logback-spark-executor.xml.
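As a minimal sketch, such a logback file might look like the following; the appender name, output path, and log level are illustrative assumptions, not a canonical configuration for your distribution:

<configuration>
  <!-- assumed file appender for executor output; path is illustrative -->
  <appender name="EXECUTOR_FILE" class="ch.qos.logback.core.FileAppender">
    <file>/tmp/executor.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="EXECUTOR_FILE"/>
  </root>
</configuration>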
Potentially add settings for rolling executor logging: You may want to add some configuration settings in spark-daemon-defaults.conf if you want rolling executor logging.
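As a sketch, Spark's documented rolling-log properties could be placed there; the property names are standard Spark configuration keys, but whether spark-daemon-defaults.conf (rather than spark-defaults.conf) is the right file depends on your distribution:

# roll executor stdout/stderr by time, keeping one week of files (values are examples)
spark.executor.logs.rolling.strategy=time
spark.executor.logs.rolling.time.interval=daily
spark.executor.logs.rolling.maxRetainedFiles=7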
Runtime variables: Note that there are some runtime (RTE) variables related to the executor logs that can be set.
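For example, in standalone mode the conf/spark-env.sh variables SPARK_LOG_DIR and SPARK_WORKER_DIR influence where daemon and per-application executor logs end up; the answer does not say which variables it means, so treat these as illustrative:

export SPARK_LOG_DIR=/var/log/spark          # daemon log location (assumed path)
export SPARK_WORKER_DIR=/var/run/spark/work  # per-application executor logs and scratch space (assumed path)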