
How to convert a DateTime with milliseconds into epoch time with milliseconds

I have data in a Hive table in the following format.

2019-11-21 18:19:15.817

I wrote a SQL query as below to convert the above column value into epoch format.

val newDF = spark.sql(f"""select TRIM(id) as ID, unix_timestamp(sig_ts) as SIG_TS from table""")

I am getting the output column SIG_TS as 1574360296, which does not include the milliseconds.

How to get the epoch timestamp of a date with milliseconds?

Simple way: create a UDF, since Spark's built-in unix_timestamp function truncates to seconds.

import java.sql.Timestamp
import org.apache.spark.sql.functions.{udf, unix_timestamp}
import spark.implicits._

// Timestamp.getTime returns milliseconds since the epoch, so no precision is lost.
val fullTimestampUDF = udf((t: Timestamp) => t.getTime)

val df = Seq("2019-11-21 18:19:15.817").toDF("sig_ts")
    .withColumn("sig_ts_ut", unix_timestamp($"sig_ts"))        // seconds only
    .withColumn("sig_ts_ut_long", fullTimestampUDF($"sig_ts")) // milliseconds

df.show(false)
+-----------------------+----------+--------------+
|sig_ts                 |sig_ts_ut |sig_ts_ut_long|
+-----------------------+----------+--------------+
|2019-11-21 18:19:15.817|1574356755|1574356755817 |
+-----------------------+----------+--------------+
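If you'd rather avoid a UDF, a minimal sketch without one (assuming Spark 2.x+ and the same sig_ts column as above): casting a timestamp to a double yields epoch seconds with the fraction intact, so scaling by 1000 gives milliseconds. Rounding before the final cast guards against the floating-point result landing just below the integer and being truncated.

import org.apache.spark.sql.functions.{col, round}
import spark.implicits._

val df2 = Seq("2019-11-21 18:19:15.817").toDF("sig_ts")
    // timestamp -> double gives epoch seconds with millisecond precision
    // in the fractional part; * 1000 scales it to milliseconds.
    .withColumn("sig_ts_ms",
        round(col("sig_ts").cast("timestamp").cast("double") * 1000).cast("long"))

df2.show(false)
// sig_ts_ms should match sig_ts_ut_long from the UDF approach
// (the exact value depends on the session time zone).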

