How to get aggregated logs from a running Spark application
I have a continuously running Spark application that reads messages from Kafka and does some processing. Is there a way to get aggregated "application logs" while it is still running?
AFAIK, log aggregation happens only when the SparkContext is destroyed.
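A possible workaround, assuming YARN log aggregation is enabled on the cluster: since Hadoop 2.6 (YARN-2468), NodeManagers can periodically roll up the logs of long-running applications, so aggregated logs become available before the application (and its SparkContext) terminates. A minimal yarn-site.xml sketch; the 3600-second interval is an illustrative value, not a recommendation:

    <property>
      <name>yarn.log-aggregation-enable</name>
      <value>true</value>
    </property>
    <property>
      <!-- Aggregate logs of still-running applications once per hour.
           The default, -1, disables rolling aggregation, so logs are
           only aggregated after the application finishes. -->
      <name>yarn.nodemanager.log-aggregation.roll-monitoring-interval-seconds</name>
      <value>3600</value>
    </property>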
Have you tried the standard YARN command for fetching logs?
    yarn logs -applicationId some-id
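A hedged usage example with a placeholder application id (on newer Hadoop releases this command can also fetch the logs of still-running containers directly from the NodeManagers, and can be narrowed to a single container; on older releases the container form may additionally require -nodeAddress):

    yarn logs -applicationId application_1234567890123_0001
    yarn logs -applicationId application_1234567890123_0001 -containerId container_1234567890123_0001_01_000002

If the command reports that logs are not available, the application's logs simply have not been aggregated (or rolled up) yet.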