
Spark Executor logs in local or standalone mode

I am running a spark-submit job in my local environment and want to see the executor logs to debug the whole process. For this I have made the following changes:

  1. Editing the log4j property files - two property files, one for the executor and one for the driver (a sketch of the driver-side file is shown after this list):

    log4j.rootCategory=DEBUG, file
    log4j.appender.file=org.apache.log4j.FileAppender
    log4j.appender.file.File=/tmp/executor-application.log
    log4j.appender.file.append=false
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

  2. Adding the logging options to the spark-defaults.conf file:

 spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-executor.properties
 spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-driver.properties
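
For reference, the question mentions two property files but only shows the executor one. A minimal sketch of log4j-driver.properties, assuming it simply mirrors the executor file and differs only in the output path (the path below is an assumption):

    log4j.rootCategory=DEBUG, file
    log4j.appender.file=org.apache.log4j.FileAppender
    # illustrative path; the actual driver log location is not shown in the question
    log4j.appender.file.File=/tmp/driver-application.log
    log4j.appender.file.append=false
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n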

When I run the spark-submit job locally, I see only the driver logs, not the executor logs.

spark-submit --master "local[*]" --class com.test.action.myjob test_job.jar 
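
As an alternative to editing spark-defaults.conf, the same options can be passed straight to spark-submit with --conf, which makes it easy to check that they are actually picked up; a sketch reusing the paths from the question:

    # pass the log4j locations on the command line instead of spark-defaults.conf
    spark-submit --master "local[*]" \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-driver.properties" \
      --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-executor.properties" \
      --class com.test.action.myjob test_job.jar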

Am I missing something? Why can't I see the executor logs? Any pointers would help.

thanks

Configure logging options:

You may need to configure logging options for executors in logback-spark-executor.xml.
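
logback-spark-executor.xml appears to come from a vendor Spark distribution rather than stock Apache Spark, so its exact contents depend on your setup. A minimal logback sketch of a file appender for executor logs; the appender name, level, and log path are illustrative assumptions:

    <!-- minimal sketch; the path and level are placeholders, not defaults -->
    <configuration>
      <appender name="FILE" class="ch.qos.logback.core.FileAppender">
        <file>/tmp/executor-application.log</file>
        <encoder>
          <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36}:%line - %msg%n</pattern>
        </encoder>
      </appender>
      <root level="DEBUG">
        <appender-ref ref="FILE"/>
      </root>
    </configuration>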

Rolling executor logging: if you want rolling executor logs, add the relevant settings in spark-daemon-defaults.conf (sketched below).
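
spark-daemon-defaults.conf is likewise distribution-specific; in stock Apache Spark, rolling of the standalone executors' stdout/stderr is controlled by the spark.executor.logs.rolling.* properties. A sketch with placeholder values (these are applied by the standalone Worker, so they do not take effect in local[*] mode):

    # roll executor stdout/stderr by size; the values below are illustrative
    spark.executor.logs.rolling.strategy          size
    spark.executor.logs.rolling.maxSize           134217728
    spark.executor.logs.rolling.maxRetainedFiles  5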

Runtime variables: note that there are also some runtime (RTE) variables related to the executor logs that can be set (see the sketch below).
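
The variables are not named here; in stock Spark the usual candidates are environment variables set in conf/spark-env.sh. A sketch with illustrative paths:

    # where Spark daemon logs are written (default: ${SPARK_HOME}/logs); path is illustrative
    export SPARK_LOG_DIR=/var/log/spark
    # standalone worker work directory, which holds each executor's stdout/stderr; path is illustrative
    export SPARK_WORKER_DIR=/var/run/spark/work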
