
PySpark: how to get a date from week number and year

I have a dataframe with a column containing a week number and year, for example 18/2020, which corresponds to the week starting 2020-04-27. How can I derive the full date column from it?

Simply use to_date with the format w/yyyy:

from pyspark.sql.functions import col, to_date

df = spark.createDataFrame([(1, "18/2020")], ['id', 'week_year'])
df.withColumn("date", to_date(col("week_year"), "w/yyyy")).show()

#+---+---------+----------+
#| id|week_year|      date|
#+---+---------+----------+
#|  1|  18/2020|2020-04-26|
#+---+---------+----------+
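Note that the output above is 2020-04-26, a Sunday, while the question expects Monday 2020-04-27: the w pattern uses a Sunday-based week. Also, Spark 3.x's default datetime parser rejects week-based pattern letters such as w unless spark.sql.legacy.timeParserPolicy is set to LEGACY. As a hedged alternative (not from the original answer), the ISO Monday can be computed in plain Python and, if needed, wrapped as a UDF:

```python
from datetime import date

def iso_week_start(week_year: str) -> date:
    """Return the Monday of an ISO week given a 'ww/yyyy' string."""
    week, year = week_year.split("/")
    # fromisocalendar(year, week, day): day 1 is Monday (Python 3.8+)
    return date.fromisocalendar(int(year), int(week), 1)

print(iso_week_start("18/2020"))  # 2020-04-27
```

To use this on the DataFrame, it could be registered as a UDF, e.g. udf(iso_week_start, DateType()), at the usual serialization cost of Python UDFs.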
