How to write logs in a file using Apache Spark

I am working on a Maven project. Apache Spark has its own log4j functionality. Is there a way I can write both the Apache Spark logs and my own log statements to a file?

As with any other Java application, you need the jars on the classpath and you need a log4j config file.

[1] Jars in the classpath

Spark already comes with these two in its jars folder (under $SPARK_HOME):

slf4j-api-1.7.16.jar 
slf4j-log4j12-1.7.16.jar

So you just need a "provided" dependency in your pom, which lets you compile against the jar while Spark supplies it at runtime:

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.16</version>
    <scope>provided</scope>
</dependency>

Note that a different version of Spark may come with a different version of slf4j, so take a look in the jars folder to match the correct version to the Spark you use.

[2] A log4j config file

  • Go to the conf folder under the Spark home dir.
  • Rename the file log4j.properties.template to log4j.properties.
  • Add your config (a minimal example follows this list).
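As a minimal sketch, a log4j.properties that routes everything to a file could look like this (the path /tmp/spark-app.log and the package com.example are placeholders, not anything Spark mandates):

# Send everything at INFO and above to a file
log4j.rootCategory=INFO, file

# File appender (the path is just an example; adjust it)
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/tmp/spark-app.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Lower the level for your own package so log.debug shows up
# (replace com.example with your package)
log4j.logger.com.example=DEBUG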

Now you can add log.debug statements in your code.
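A minimal sketch of what that looks like with SLF4J (the class name MyApp is just an example):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyApp {
    // Backed by the log4j config above via the slf4j-log4j12 binding
    private static final Logger log = LoggerFactory.getLogger(MyApp.class);

    public static void main(String[] args) {
        log.info("application starting");
        log.debug("visible only if this logger's level is DEBUG");
    }
}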

Note that your code may run on the driver or on the executors, so the logs will end up on different machines depending on where each part runs.
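To illustrate, here is a sketch of logging from inside an executor-side closure; the logger is obtained inside the closure because Logger instances are not serializable (the class and logger names are placeholders):

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class ExecutorLogging {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("executor-logging");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3));

        // This closure runs on the executors, so these lines land in the
        // executors' log files, not the driver's.
        rdd.foreachPartition(it -> {
            Logger execLog = LoggerFactory.getLogger("com.example.ExecutorSide");
            while (it.hasNext()) {
                execLog.debug("processing record: {}", it.next());
            }
        });

        sc.close();
    }
}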
