
How to get aggregated logs from a running Spark application

I have a continuously running Spark application which reads messages from Kafka and does some processing. Is there a way to get aggregated "application logs" while it is still running?

As far as I know, YARN log aggregation happens only when the SparkContext is destroyed, i.e., when the application finishes.
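For long-running applications, newer Hadoop versions (2.6+) support rolling log aggregation, which uploads logs periodically instead of only at application exit. A minimal `yarn-site.xml` sketch, assuming log aggregation is otherwise configured on the cluster (the interval value here is just an illustration; clusters often enforce a minimum of one hour):

```xml
<!-- yarn-site.xml (NodeManager configuration) -->
<property>
  <!-- Enable log aggregation to HDFS at all -->
  <name>yarn.log-aggregation-enable</name>
  <value>true</value>
</property>
<property>
  <!-- How often (seconds) to roll and upload logs for running apps;
       -1 (the default) means upload only when the application finishes -->
  <name>yarn.nodemanager.log-aggregation.roll-monitoring-interval-seconds</name>
  <value>3600</value>
</property>
```

With rolling aggregation enabled, `yarn logs` can return partial logs while the application is still running.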

Have you tried the standard YARN command for logs?

yarn logs -applicationId some-id
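A short usage sketch on a cluster node (the application and container ids below are hypothetical examples, not real ids):

```shell
# List running YARN applications to find the application id
yarn application -list -appStates RUNNING

# Fetch the aggregated logs for one application
# (works for a running app only if rolling log aggregation is enabled;
#  otherwise logs appear after the application finishes)
yarn logs -applicationId application_1234567890123_0001

# Optionally restrict the output to a single container,
# e.g. one executor (supported in recent Hadoop versions)
yarn logs -applicationId application_1234567890123_0001 \
    -containerId container_1234567890123_0001_01_000002
```

If aggregation has not yet run, the same logs are available unaggregated through the ResourceManager web UI or in each NodeManager's local log directory.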


 