
Spark SQL Unsupported datatype TimestampType

I am new to Spark and Scala. I am trying to read a text file and save it as a Parquet file. One of the fields I am using is a timestamp, and the docs say that Spark 1.1.0 supports java.util.TimeStamp.
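A minimal sketch of the kind of job that triggers this on Spark 1.1/1.2 (the file name, the Event case class, and the line format are placeholders of mine, not from the question; note that the Scala type Spark SQL maps to TimestampType is java.sql.Timestamp):

import java.sql.Timestamp
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Placeholder schema: any case class with a java.sql.Timestamp field will do.
case class Event(id: Int, name: String, ts: Timestamp)

object SaveParquet {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SaveParquet"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD   // implicit RDD[Product] -> SchemaRDD (Spark 1.x API)

    // Assumed input format: "id,name,yyyy-mm-dd hh:mm:ss" per line.
    val events = sc.textFile("events.txt").map { line =>
      val Array(id, name, ts) = line.split(",")
      Event(id.toInt, name, Timestamp.valueOf(ts))
    }

    // On Spark < 1.3.0 this throws "Unsupported datatype TimestampType".
    events.saveAsParquetFile("events.parquet")
  }
}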

The runtime error I get while saving to Parquet files is:

Exception in thread "main" java.lang.RuntimeException: Unsupported datatype TimestampType
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$fromDataType$2.apply(ParquetTypes.scala:301)

Any recommendation is really appreciated.

Thanks

This is actually a known bug in versions prior to 1.3.0: https://issues.apache.org/jira/browse/SPARK-4987

A pull request has already been merged (https://github.com/apache/spark/pull/3820), but the fix won't be available until 1.3.0 is released.

If you're in a rush, you can build the master branch locally; otherwise you might have to wait. There is hope, though: the vote on RC2 of Spark 1.3.0 as the final release started yesterday, so if everything is OK with the candidate, things could move quite fast.
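Not part of the original answer, but a common stopgap until the upgrade is to avoid TimestampType in Parquet altogether: store the timestamp as a plain Long (epoch milliseconds) when writing, and rebuild the java.sql.Timestamp after reading. A rough sketch, reusing the hypothetical Event case class from the question's repro above:

import java.sql.Timestamp
import org.apache.spark.rdd.RDD

// Same placeholder schema as in the sketch above.
case class Event(id: Int, name: String, ts: Timestamp)

// Parquet-friendly mirror of Event that avoids TimestampType entirely.
case class EventRow(id: Int, name: String, tsMillis: Long)

// Before writing: replace the Timestamp with its epoch-millisecond value.
def toParquetFriendly(events: RDD[Event]): RDD[EventRow] =
  events.map(e => EventRow(e.id, e.name, e.ts.getTime))

// After reading back: rebuild the Timestamp from the stored Long.
def fromParquetFriendly(rows: RDD[EventRow]): RDD[Event] =
  rows.map(r => Event(r.id, r.name, new Timestamp(r.tsMillis)))

You would then call toParquetFriendly(events).saveAsParquetFile(...) instead of saving the original RDD; once the cluster is on 1.3.0, the extra conversion can simply be dropped.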

Regards,

