
How to change the string to timestamp in Pyspark?

I am trying to convert a string to a timestamp in PySpark (Spark version 2.3.0) with the data set and API below.

I have tried several solutions from Stack Overflow, but none of them helped convert the string to a timestamp.

df:
|Customer|Transaction_Timestamp|Transaction_Base_Point_Value|
+--------+---------------------+----------------------------+
|Cust1   |10/25/2017 1:47      |2000                        |

Attempt 1

df2 = df.select('Customer', 'Transaction_Timestamp','Transaction_Base_Point_Value', unix_timestamp('Transaction_Timestamp', "dd/MM/yy HH:mm") .cast(TimestampType()).alias("Timestamp")).show(1, False)

Attempt 2

df.withColumn('Time', to_timestamp("Transaction_Timestamp", "yyyy_MM_dd hh_mm_ss").cast("Timestamp"))

Attempt 3

change_type= df.withColumn('Timestamp', col='Transaction_Timestamp').cast('timestamp')

However, the schema produces the following output:

 |-- Timestamp: timestamp (nullable = true)

I need to get the output as follows, so that I can perform other operations on the timestamp:

|Customer|Transaction_Timestamp|Transaction_Base_Point_Value|      Timestamp|
+--------+---------------------+----------------------------+---------------+
|   Cust1|      10/25/2017 1:47|                        2000|10/25/2017 1:47|
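For context on why Attempt 1 fails: its pattern "dd/MM/yy HH:mm" reads 10 as the day and 25 as the month, which is invalid, so parsing yields null. A minimal sketch with Python's standard-library datetime (not Spark; the format codes are only rough analogues of Spark's patterns, used here for illustration):

```python
from datetime import datetime

value = "10/25/2017 1:47"

# Attempt 1's Spark pattern "dd/MM/yy HH:mm" treats 25 as the month; the
# rough stdlib equivalent fails the same way (Spark returns null instead).
try:
    parsed = datetime.strptime(value, "%d/%m/%y %H:%M")
except ValueError:
    parsed = None

# A month/day-first pattern matches the data and parses cleanly.
parsed = datetime.strptime(value, "%m/%d/%Y %H:%M")
# → datetime(2017, 10, 25, 1, 47)
```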

Use to_timestamp from pyspark.sql.functions:

.withColumn('Timestamp', to_timestamp('Transaction_Timestamp', 'MM/dd/yyyy hh:mm'))

A padded hour value would also be nice to have: 01:47 instead of 1:47.
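Once the column is a real timestamp, a zero-padded display can be produced by formatting it back to a string, e.g. with Spark's date_format and an "HH" hour pattern. A sanity check of the padding behavior with plain Python's stdlib datetime (an analogue, not Spark itself):

```python
from datetime import datetime

# Parse the sample value with a month/day-first pattern.
ts = datetime.strptime("10/25/2017 1:47", "%m/%d/%Y %H:%M")

# "%H" zero-pads the hour, just as Spark's "HH" pattern does in
# date_format(col, "MM/dd/yyyy HH:mm").
padded = ts.strftime("%m/%d/%Y %H:%M")
# → '10/25/2017 01:47'
```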
