
How to convert "/Date(epoch time)/" string in PySpark

I have a JSON file where all the dates are in the form /Date(1602949450000)/.

The JSON is also nested. Is there a generic way to parse all of the /Date()/ values into timestamps?

I tried regexp_replace, but I cannot convert the capture group into a timestamp:

regexp_replace("value", "\\/Date\\((\\-?\\d*?)([\\+\\-]\\d*)?\\)\\/","$1")

Your regex is correct, but the issue seems to be "$1". regexp_extract takes the group as an integer index, not a "$1" reference. When I ran the code with a 1 instead, it worked for me:

from pyspark.sql import functions as F

dates_df = dates_df.withColumn(
    'date_extracted',
    F.regexp_extract("date", "\\/Date\\((\\-?\\d*?)([\\+\\-]\\d*)?\\)\\/", 1)
)

A simpler regex which does the same thing would be:

dates_df = dates_df.withColumn(
    'date_extracted',
    F.regexp_extract("date", r"^.+(\d{13}).+$", 1)
)
