
Error writing parquet file in S3 with Apache Spark in Scala

I'm using Databricks to run my Spark cluster (On-demand / Spark 1.6.0 (Hadoop 1)), with a Scala notebook.

The first time I try to create a Parquet file in my S3 bucket with Scala, it throws this error:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

But when I check my bucket, the files are there. If I run the cell again with the same code, it works fine; it seems like only the first run fails.

This is my code:

data_frame.write.mode("append").partitionBy("date").parquet("s3n://...")
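
For context, here is a minimal self-contained sketch of that write for Spark 1.6 (the column names, sample rows, and S3 bucket path are assumptions for illustration; the original notebook only shows the single write line):

import org.apache.spark.sql.SQLContext

// sc is the SparkContext already provided by the Databricks notebook
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// Hypothetical data; the real data_frame comes from elsewhere in the notebook
val data_frame = Seq(
  ("2016-01-01", "a", 1),
  ("2016-01-02", "b", 2)
).toDF("date", "key", "value")

// Append Parquet files partitioned by "date" under an S3 prefix
// (s3n:// scheme, since the cluster runs Hadoop 1; the bucket/path is a placeholder)
data_frame.write
  .mode("append")
  .partitionBy("date")
  .parquet("s3n://my-bucket/output")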

Resolved in the comments:

This isn't an exception, it's related to the configuration of log4j. This shouldn't be a problem. – Yuval Itzchakov
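
In other words, the SLF4J message is only a logging warning: no binding for SLF4J was found on the classpath the first time logging was initialized, so it fell back to the no-operation logger, and the Parquet write itself still succeeded. Outside a managed Databricks classpath, one common way to silence it (an assumption, not something confirmed in the thread) is to add a concrete SLF4J binding to the driver's dependencies, e.g. in build.sbt:

// build.sbt: provide a concrete SLF4J binding so the NOP fallback is not used
libraryDependencies += "org.slf4j" % "slf4j-log4j12" % "1.7.21"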

