
Trying to write DataFrame to CSV file

I am trying to write my DataFrame to a CSV file. I tried this:

df.write.format("com.databricks.spark.csv").option("header", true)
  .option("codec", "org.apache.hadoop.io.compress.GzipCodec").save("myFile.csv")  

but it gives me the error:

java.lang.UnsupportedOperationException: CSV data source does not support array<struct<columnName:columntype...

If I show the DataFrame on the console, however, it prints fine. How can I write it to a CSV file? Even a text file would do.

Thanks!!

EDIT

I didn't need to write everything out. I selected out only the rows I needed and got it working. Thanks for the help though!
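
For reference, the error itself goes away once the complex column is left out of the write. A minimal sketch of that (the column names below are hypothetical; only simple-typed columns are kept):

import org.apache.spark.sql.DataFrame

// df is the original DataFrame from the question.
// Keep only simple-typed columns; the array<struct<...>> column is left out,
// since CSV cannot represent it.
val simpleDf: DataFrame = df.select("id", "name", "eventDate")

simpleDf.write.format("com.databricks.spark.csv")
  .option("header", true)
  .option("codec", "org.apache.hadoop.io.compress.GzipCodec")
  .save("myFile.csv")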

Your DataFrame has a complex column (an array of structs, it seems). With CSV you can only have simple column types like String, Int, Date, etc., but no arrays or structs.
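
If you do need that column in the output, one option is to serialize it to a string first, since a string is a simple type CSV can hold. A sketch, assuming a Spark version where org.apache.spark.sql.functions.to_json supports array-of-struct columns (2.2+), and a hypothetical column name "events":

import org.apache.spark.sql.functions.{col, to_json}

// "events" is a stand-in for the array<struct<...>> column from the error.
// to_json turns it into a JSON string, which the CSV writer accepts.
val writable = df.withColumn("events", to_json(col("events")))

writable.write.format("com.databricks.spark.csv")
  .option("header", true)
  .save("myFile.csv")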
