Spark dataFrame convert columns Datatype from String to Date
I have the following data, with this schema:
scala> df2.printSchema()
root
|-- RowID: integer (nullable = true)
|-- Order Date: string (nullable = true)
scala> df2.show(5)
+-----+----------+
|RowID|Order Date|
+-----+----------+
|    1|   4/10/15|
|   49|   4/10/15|
|   50|   4/10/15|
|   80|   4/10/15|
|   85|   4/10/15|
+-----+----------+
I want to convert the "Order Date" string column to a Date datatype. I tried the following, but with no luck. Can someone suggest a better approach?
scala> df2.select(df2.col("RowID"), df2.col("Order Date"), date_format(df2.col("Order Date"), "M/dd/yy")).show(5)
+-----+----------+-------------------------------+
|RowID|Order Date|date_format(Order Date,M/dd/yy)|
+-----+----------+-------------------------------+
|    1|   4/10/15|                           null|
|   49|   4/10/15|                           null|
|   50|   4/10/15|                           null|
|   80|   4/10/15|                           null|
|   85|   4/10/15|                           null|
+-----+----------+-------------------------------+
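The nulls above are expected: `date_format` goes the other way, taking a date or timestamp column and *formatting* it as a string; fed a string like `4/10/15`, Spark first attempts an implicit cast to timestamp, which fails, producing null. To *parse* a string into a date, a sketch using `to_date` with an input pattern (this two-argument form is available from Spark 2.2 onward; `converted` is a hypothetical name):

```scala
import org.apache.spark.sql.functions.to_date

// Parse the "M/d/yy"-formatted strings directly into a DateType column,
// replacing the original "Order Date" column in place.
val converted = df2.withColumn("Order Date", to_date(df2.col("Order Date"), "M/d/yy"))

converted.printSchema()
// root
//  |-- RowID: integer (nullable = true)
//  |-- Order Date: date (nullable = true)
```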
I did manage to convert it to a unix epoch timestamp, and I assume it is straightforward from here:
scala> df.select(df.col("RowID"), df.col("Order Date"), unix_timestamp(df.col("Order Date"), "M/d/yy")).show(5)
+-----+----------+--------------------------------+
|RowID|Order Date|unixtimestamp(Order Date,M/d/yy)|
+-----+----------+--------------------------------+
|    1|   4/10/15|                      1428604200|
|   49|   4/10/15|                      1428604200|
|   50|   4/10/15|                      1428604200|
|   80|   4/10/15|                      1428604200|
|   85|   4/10/15|                      1428604200|
+-----+----------+--------------------------------+
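One way to finish the conversion from that epoch value, which also works on older Spark versions that lack the two-argument `to_date`, is to cast the epoch seconds to a timestamp and then down to a date (a sketch; `withDate` is a hypothetical name):

```scala
import org.apache.spark.sql.functions.unix_timestamp

// unix_timestamp parses the string into epoch seconds (a long);
// casting long -> timestamp -> date drops the time-of-day part,
// leaving a proper DateType column.
val withDate = df.withColumn(
  "Order Date",
  unix_timestamp(df.col("Order Date"), "M/d/yy").cast("timestamp").cast("date")
)
```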