How do I convert a string column (containing only a time, not a date) to a timestamp in Spark Scala?
I need to convert a column that contains only a time as a string to a timestamp type, or to any other time type available in Spark.
Below is the test DataFrame, which has "Time_eg" as a string column:
Time_eg
12:49:09 AM
12:50:18 AM
Schema before the conversion to time:
Time_eg: string (nullable = true)
//Converting to time stamp
val transType = test.withColumn("Time_eg", test("Time_eg").cast("timestamp"))
Schema after converting to timestamp:
Time_eg: timestamp (nullable = true)
But the output of transType.show() gives null values for the "Time_eg" column.
Please let me know how to convert a column that contains only a time as a string to a timestamp in Spark Scala.
Any help would be much appreciated.
Thanks
You need to use a specific function to convert the string to a timestamp and specify the format. A plain cast returns null here because the string does not match any full date-time format. Also note that a timestamp in Spark represents a full date (with time of day); if you do not provide the date, it will be set to January 1st, 1970, the beginning of Unix timestamps.
In your case, you can convert your strings as follows:
import org.apache.spark.sql.functions.to_timestamp
import spark.implicits._

Seq("12:49:09 AM", "09:00:00 PM")
  .toDF("Time_eg")
  .select(to_timestamp('Time_eg, "hh:mm:ss a") as "ts")
  .show
+-------------------+
| ts|
+-------------------+
|1970-01-01 00:49:09|
|1970-01-01 21:00:00|
+-------------------+
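As a side note, if you truly only need the time of day and never the date, the same pattern can be parsed outside Spark with java.time.LocalTime, e.g. in a UDF or in plain Scala code. A minimal self-contained sketch (the TimeParse object name is just for illustration):

```scala
import java.time.LocalTime
import java.time.format.DateTimeFormatter
import java.util.Locale

object TimeParse {
  // 12-hour clock ("hh") with an AM/PM marker ("a");
  // Locale.ENGLISH makes sure "AM"/"PM" parse reliably regardless of the JVM default locale
  val fmt: DateTimeFormatter = DateTimeFormatter.ofPattern("hh:mm:ss a", Locale.ENGLISH)

  def parseTime(s: String): LocalTime = LocalTime.parse(s, fmt)

  def main(args: Array[String]): Unit = {
    // "12:49:09 AM" is 00:49:09 on the 24-hour clock
    println(TimeParse.parseTime("12:49:09 AM"))
    println(TimeParse.parseTime("09:00:00 PM"))
  }
}
```

A LocalTime carries no date at all, so there is no artificial 1970-01-01 component to deal with; you only need the timestamp route if you plan to use Spark's date/time column functions.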