
How to convert "/Date(epoch time)/" string in PySpark

I have a JSON file where all the dates come in the form /Date(1602949450000)/.

The JSON is also nested. Is there a generic way to parse every /Date()/ value into a timestamp?

I tried regexp_replace, but I cannot convert the capture group into a timestamp.

regexp_replace("value", "\\/Date\\((\\-?\\d*?)([\\+\\-]\\d*)?\\)\\/","$1")

Your regex is correct, but the issue seems to be the "$1". When I ran it with regexp_extract and a group index of 1 instead, it worked for me:

from pyspark.sql import functions as F

dates_df = dates_df.withColumn(
    'date_extracted',
    # capture group 1 holds the epoch milliseconds as a string
    F.regexp_extract("date", "\\/Date\\((\\-?\\d*?)([\\+\\-]\\d*)?\\)\\/", 1)
)

A simpler regex that does the same thing would be:

dates_df = dates_df.withColumn(
    'date_extracted',
    F.regexp_extract("date", "^.+(\\d{13}).+$", 1)
)
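
The question also asks for a generic way to handle every /Date()/ field in nested JSON. A rough sketch under the assumption that the nested structs have already been flattened into top-level string columns (the _ts suffix and the pattern variable are arbitrary names):

from pyspark.sql import functions as F
from pyspark.sql.types import StringType

# For every string column, add a parsed timestamp column next to it.
# regexp_extract returns "" on non-matching rows, which casts to null.
pattern = "\\/Date\\((\\-?\\d*?)([\\+\\-]\\d*)?\\)\\/"
for field in dates_df.schema.fields:
    if isinstance(field.dataType, StringType):
        dates_df = dates_df.withColumn(
            field.name + "_ts",
            (F.regexp_extract(field.name, pattern, 1).cast("long") / 1000).cast("timestamp")
        )

Nested struct fields would first need to be pulled up into top-level columns, for example by selecting their "outer.inner" paths.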
