I have a DataFrame containing data like:
+----+-----+---+---+
|Year|Month|Day|...|
+----+-----+---+---+
|2012|    2| 20|   |
|2011|    7|  6|   |
|2015|    3| 15|   |
+----+-----+---+---+
and I would like to add a date column.
Merge the columns together, then use unix_timestamp and to_date to get a date column. For an input DataFrame df:
import org.apache.spark.sql.functions._

df.withColumn("merge", concat_ws("-", $"Year", $"Month", $"Day"))
  .withColumn("date", to_date(unix_timestamp($"merge", "yyyy-MM-dd").cast("timestamp")))
  .drop("merge")
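The same two-step assembly can be sketched outside Spark: concat_ws builds a "Year-Month-Day" string per row, which is then parsed into a date. A minimal pure-Python illustration (not Spark code; the function name here is just for the sketch):

```python
from datetime import datetime

def assemble_date(year: int, month: int, day: int):
    # Build the "Year-Month-Day" string, as concat_ws("-", ...) does per row...
    merged = "-".join(str(x) for x in (year, month, day))  # e.g. "2012-2-20"
    # ...then parse it into a date, as unix_timestamp + to_date do above.
    # Python's %m/%d accept non-zero-padded values like "2" and "6".
    return datetime.strptime(merged, "%Y-%m-%d").date()

rows = [(2012, 2, 20), (2011, 7, 6), (2015, 3, 15)]
print([assemble_date(*r).isoformat() for r in rows])
# ['2012-02-20', '2011-07-06', '2015-03-15']
```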
Not as complex as Shaido's answer; in PySpark, just:

df.withColumn("date", F.to_date(F.concat_ws("-", "Year", "Month", "Day"))).show()

Works on Spark 2.4.
For Spark 3+, you can use the make_date function:
df.withColumn("date", expr("make_date(Year, Month, Day)"))
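make_date builds the date directly from the integer components, with no intermediate string. Per row it behaves like constructing a date value from (Year, Month, Day); a pure-Python sketch of that semantics (illustration only, not Spark code, and the function name is hypothetical):

```python
from datetime import date

def make_date_py(year: int, month: int, day: int) -> date:
    # Per-row analogue of Spark's make_date(Year, Month, Day):
    # build a date directly from integer parts, no string formatting involved.
    return date(year, month, day)

print(make_date_py(2012, 2, 20))
# 2012-02-20
```

Note that datetime.date raises on invalid components, so this is only a sketch of the happy path, not of Spark's null-handling.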
You can use the concat_ws function to build the date as a string and then cast it to the date type.
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._
//Source Data
val df = Seq((2012,2,20),(2011,7,6),(2015,3,15)).toDF("Year","Month","Day")
//using concat_ws function to create Date column and cast that column data type to date
val df1 = df.withColumn("Date",concat_ws("-",$"Year",$"Month",$"Day"))
.withColumn("Date",$"Date".cast("Date"))
display(df1)
You can see the output as below:

+----+-----+---+----------+
|Year|Month|Day|      Date|
+----+-----+---+----------+
|2012|    2| 20|2012-02-20|
|2011|    7|  6|2011-07-06|
|2015|    3| 15|2015-03-15|
+----+-----+---+----------+