
PySpark: How to extract a datetime value from a single-row DataFrame?

I've got a Spark DataFrame and I'm trying to extract a value from it for later use:

max_dttm = spark.sql("""select max(issue_dttm) from psycho_sb.yso_sendsay_im_issues""")

The result looks like this:

+-------------------+
|    max(issue_dttm)|
+-------------------+
|2018-12-25 09:01:30|
+-------------------+

How can I extract that value as a timestamp/datetime? I want to assign it to a variable.

Something like this:

max_dttm = spark.sql("""select max(issue_dttm) as max_issue_dttm from psycho_sb.yso_sendsay_im_issues""")
max_issue_dttm = max_dttm.collect()[0].max_issue_dttm
