
Capture Spark executor logs in a local file in YARN cluster mode

I am running Spark Streaming in YARN cluster mode and want to capture the logs and write them to a local file on the driver. To do this I created a custom log4j.properties file that points to a local file path on the driver, but I can only see the driver's logs in that file. Why are my executors' logs not captured in this file, and how can I capture the executor logs? I have tried different approaches; my spark-submit command is as follows:

spark-submit --master yarn --deploy-mode cluster \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/home/log/conf/log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/home/log/conf/log4j.properties" \
  --class com.Word.count.SparkStream /home/project/WordCount/target/Count-0.0.1-SNAPSHOT.jar
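The actual log4j.properties was not posted (see the comment below), so this is only a minimal sketch of the kind of configuration described, assuming log4j 1.x and a hypothetical appender name and log path. Note that a FileAppender always writes to the local filesystem of whichever JVM loads it, so on each executor this path resolves to that executor's own node, not to the driver's machine:

# Minimal log4j 1.x sketch (assumed layout; the real file was not posted).
# The root logger writes to the console and to a local file. The file path
# below is hypothetical.
log4j.rootLogger=INFO, console, local

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# FileAppender writes to the local filesystem of the JVM that loads this
# configuration: on the driver this is the driver's node, on each executor
# it is that executor's node, which is why the driver's file only ever
# contains driver logs.
log4j.appender.local=org.apache.log4j.FileAppender
log4j.appender.local.File=/home/log/spark-app.log
log4j.appender.local.layout=org.apache.log4j.PatternLayout
log4j.appender.local.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n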

Can you post your log4j.properties? I assume you can see the executor logs in the executor nodes' local directories.
