
Spark Executor logs in local or standalone mode

I am running a spark-submit job in my local environment, and to debug the whole process I want to see the executor logs. For this I have made the following changes:

  1. Editing the log4j property files - two property files, one for the executor and one for the driver:

    log4j.rootCategory=DEBUG, file
    log4j.appender.file=org.apache.log4j.FileAppender
    log4j.appender.file.File=/tmp/executor-application.log
    log4j.appender.file.append=false
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

  2. Adding the log settings to the spark-defaults.conf file:

 spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-executor.properties
 spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-driver.properties

When I run the spark-submit job locally, I see only the driver logs, not the executor logs.

spark-submit --master "local[*]" --class com.test.action.myjob test_job.jar 
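One likely explanation (an assumption, not stated in the question): with `--master "local[*]"` the driver and the executors all run inside a single JVM, so no separate executor process is ever launched and `spark.executor.extraJavaOptions` never takes effect - only the driver's log4j configuration applies. A sketch of a submission against a standalone master (the `spark://localhost:7077` host/port is hypothetical), where separate executor JVMs are started and do pick up the executor options:

```shell
# Standalone master launches separate executor JVMs, which read
# spark.executor.extraJavaOptions and thus the executor log4j file.
spark-submit \
  --master spark://localhost:7077 \
  --conf spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/spark-setup/conf/log4j-executor.properties \
  --class com.test.action.myjob \
  test_job.jar
```

In that setup the executor-side log file (`/tmp/executor-application.log` above) is created on each worker machine, not on the machine running the driver.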

Am I missing something? Why can't I see the executor logs? Any pointers will help.

Thanks

Configure logging options:

You may need to configure logging options for executors in logback-spark-executor.xml.

Potentially add settings for rolling executor logging: you may want to add some configuration settings in spark-daemon-defaults.conf if you want rolling executor logs.
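As a sketch, Spark's standard `spark.executor.logs.rolling.*` properties enable rolling of executor stdout/stderr logs (note: `spark-daemon-defaults.conf` is specific to some distributions; in vanilla Spark these lines would go in `spark-defaults.conf`):

```
# Roll executor logs by time, once per day, keeping the last 7 files.
spark.executor.logs.rolling.strategy          time
spark.executor.logs.rolling.time.interval     daily
spark.executor.logs.rolling.maxRetainedFiles  7
```

A size-based alternative uses `spark.executor.logs.rolling.strategy size` together with `spark.executor.logs.rolling.maxSize` (in bytes).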

Runtime variables: note that there are some runtime (RTE) variables related to the executor logs that can be set.
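For example (a sketch using standard standalone-mode environment variables; the paths are hypothetical), in standalone mode each executor's stdout/stderr lands under the worker's work directory, which can be relocated via `conf/spark-env.sh`:

```
# conf/spark-env.sh
# Per-application executor logs end up under <dir>/<app-id>/<executor-id>/
SPARK_WORKER_DIR=/var/spark/work
# Logs of the standalone daemons themselves (master/worker)
SPARK_LOG_DIR=/var/log/spark
```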

