
How to get aggregated logs from a running Spark application

I have a continuously running Spark application which reads messages from Kafka and does some processing. Is there a way to get the aggregated "application logs" while it is still running?

AFAIK, YARN log aggregation happens only after the application finishes, i.e. when the SparkContext is destroyed.

Have you tried the standard YARN command for retrieving logs?

yarn logs -applicationId some-id
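For a long-running application, recent Hadoop versions also support rolling log aggregation, which periodically uploads container logs while the application is still alive, so `yarn logs` can show them before the SparkContext is destroyed. A sketch of the relevant setup, assuming a YARN cluster where you can edit `yarn-site.xml` (the interval value below is illustrative, and `application_1234_0001` is a placeholder ID):

```
<!-- yarn-site.xml: enable log aggregation and roll logs for running apps -->
<property>
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<property>
  <!-- Upload logs of running containers roughly every hour (illustrative value) -->
  <name>yarn.nodemanager.log-aggregation.roll-monitoring-interval-seconds</name>
  <value>3600</value>
</property>
```

With that in place, the same CLI fetches whatever has been aggregated so far, e.g.:

    yarn logs -applicationId application_1234_0001

If rolling aggregation is not available on your Hadoop version, the usual fallback is to read the per-container stdout/stderr files directly from the NodeManager web UI (or the Spark UI's "Executors" tab) while the application runs.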

