
Convert value string to Date, Scala Spark

I am getting a value from a DataFrame using a max aggregation, so I get a string, and I want to convert it to Date. What I am doing is this:

var lastDate = spark.read.parquet("data/users").select("Date").agg(max(col("Date"))).first.get(0).toString
val df2 = table_read.filter("Date=" + lastDate)

In this way I get a variable of string type, and now I want to convert it to Date type. I have been searching other answers for how to do this, but all I found was how to do it on DataFrames with to_date. How can I do it in this case?
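For reference, if the string is already in yyyy-MM-dd form, it can be parsed into a java.sql.Date without involving a DataFrame at all; a minimal sketch (the sample value is just taken from the output shown below):

// Parse a yyyy-MM-dd string into java.sql.Date (no Spark needed)
val dateString = "2019-11-10"
val parsed: java.sql.Date = java.sql.Date.valueOf(dateString)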

EDIT:

Schema:

root
 |-- Date: date (nullable = false)
 |-- op: string (nullable = true)
 |-- value: string (nullable = true)

Output of spark.read.parquet("data/users").select("Date").agg(max(col("Date"))).show:

+-----------+
|max(Date)  |
+-----------+
|2019-11-10 |
+-----------+

Error:

Exception message: cannot resolve '(`Date` = ((2021 - 12) - 14))' due to data type mismatch: differing types in '(`Date` = ((2021 - 12) - 14))' (date and int).; line 1 pos 0;

'Filter (Date#5488 = ((2021 - 12) - 14))

You can use .getDate, e.g.

var date = spark.read.parquet("data/users").select("Date").agg(max(col("Date"))).first.getDate(0)

The error in your edit comes from "Date=" + lastDate producing the unquoted string Date=2021-12-14, which Spark's SQL parser reads as the arithmetic expression (2021 - 12) - 14. To use the value in a filter, you can do either of the following:

val df2 = table_read.filter(col("Date") === date)
// or: val df2 = table_read.filter("Date='" + date + "'")
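Putting it together, a minimal end-to-end sketch, assuming a SparkSession named spark and that table_read is the DataFrame from the question:

import org.apache.spark.sql.functions.{col, max}

// getDate(0) returns a java.sql.Date directly, so no string parsing is needed
val lastDate: java.sql.Date = spark.read.parquet("data/users")
  .select("Date")
  .agg(max(col("Date")))
  .first
  .getDate(0)

// Comparing with a column expression keeps the Date type, avoiding the
// type-mismatch error caused by the unquoted string concatenation
val df2 = table_read.filter(col("Date") === lastDate)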
