
Spark: convert string to Date

I'm using Spark/Scala and I have a DataFrame. There are year/month/day columns with values like 2020/9/2. How can I add a column to the same DataFrame with the conversion to a date (yyyy-MM-dd)? I found how to convert a date from String to Date format, but I can't find a solution for how to combine the values and convert them to a date. Thanks for any advice or hint.

You can use the to_date function.

import org.apache.spark.sql.functions._ // col, to_date, concat_ws, ...
import spark.implicits._                // `spark` is your SparkSession (already in scope in spark-shell)

val df1 = Seq(
  "2020/9/2",
  "2020/9/15",
  "2020/9/30"
).toDF("str")

// the "y/M/d" pattern accepts single-digit month and day values
val df2 = df1.withColumn("dt", to_date(col("str"), "y/M/d"))
df2.show()
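
If year, month and day are stored in three separate columns, as the question describes, you can build the string on the fly with concat_ws and parse it the same way. A minimal sketch, assuming three string columns named year, month and day:

// Assumed layout: three string columns "year", "month", "day"
val df3 = Seq(("2020", "9", "2"), ("2020", "9", "15"))
  .toDF("year", "month", "day")
  .withColumn("dt", to_date(concat_ws("/", col("year"), col("month"), col("day")), "y/M/d"))
df3.show()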

I did some tests; I think you can use my example to convert the date. I hope it helps.

package com.jackpan.spark.examples

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SomeExamples {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("SomeExamples")
      .getOrCreate()

    // Three string columns: year, month, day
    val dataDF = spark.createDataFrame(Seq(("2022", "12", "09"), ("2022", "12", "19"),
      ("2022", "12", "15"))).toDF("year", "month", "day")

    // Build a "yyyy-MM-dd" string, then parse it into a DateType column
    dataDF.withColumn("dateStr",
      concat(col("year"), lit("-"), col("month"), lit("-"), col("day")))
      .withColumn("date", to_date(col("dateStr"), "yyyy-MM-dd"))
      .show(false)
  }
}

This program displays the result shown below:

+----+-----+---+----------+----------+
|year|month|day|dateStr   |date      |
+----+-----+---+----------+----------+
|2022|12   |09 |2022-12-09|2022-12-09|
|2022|12   |19 |2022-12-19|2022-12-19|
|2022|12   |15 |2022-12-15|2022-12-15|
+----+-----+---+----------+----------+
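
As a side note, Spark 3.0+ also ships a make_date SQL function that builds a DateType column directly from the three parts, so the string-concatenation step can be skipped. A small sketch, assuming the year/month/day strings cast cleanly to integers:

// make_date(year, month, day) is a built-in SQL function since Spark 3.0;
// the string columns are implicitly cast to integers here (default, non-ANSI mode)
dataDF.withColumn("date", expr("make_date(year, month, day)"))
  .show(false)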
