
Filter between datetime ranges with timezone in PySpark for parquet files

Based on the suggestion from here, I would like to know how to filter datetime ranges with a timezone using PySpark.

Here is what my data looks like:

ABC, 2020-06-22T19:17:16.428+0000

DEF, 2020-06-22T19:17:16.435+0000

JKL, 2020-06-22T19:17:16.468+0000

MNO, 2020-06-22T19:17:16.480+0000

XYZ, 2020-06-22T19:17:16.495+0000

In this case, I would like to extract only those records whose milliseconds fall between 400 and 450.

I tried this, but it didn't work:

import pyspark.sql.functions as func
df = df.select(func.to_date(df.UpdatedOn).alias("time"))
sf = df.filter(df.time > '2020-06-22T19:17:16.400').filter(df.time < '2020-06-22T19:17:16.451')

When you use to_date it truncates the time portion, so you have to use to_timestamp and compare the full timestamps instead.

from pyspark.sql.functions import to_timestamp

df.withColumn('date', to_timestamp('date')) \
  .filter("date between to_timestamp('2020-06-22T19:17:16.400') and to_timestamp('2020-06-22T19:17:16.451')") \
  .show(10, False)

+---+-----------------------+
|id |date                   |
+---+-----------------------+
|ABC|2020-06-22 19:17:16.428|
|DEF|2020-06-22 19:17:16.435|
+---+-----------------------+
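For reference, here is a minimal end-to-end sketch of the same approach. It assumes the columns are named id and date (matching the answer's output, not necessarily the original parquet schema) and that your Spark version parses the +0000 offset when casting to timestamp, as the output above suggests.

from pyspark.sql import SparkSession
from pyspark.sql.functions import to_timestamp

spark = SparkSession.builder.appName("ms-range-filter").getOrCreate()

# Sample data from the question; column names "id" and "date" are assumed.
data = [
    ("ABC", "2020-06-22T19:17:16.428+0000"),
    ("DEF", "2020-06-22T19:17:16.435+0000"),
    ("JKL", "2020-06-22T19:17:16.468+0000"),
    ("MNO", "2020-06-22T19:17:16.480+0000"),
    ("XYZ", "2020-06-22T19:17:16.495+0000"),
]
df = spark.createDataFrame(data, ["id", "date"])

# to_timestamp keeps millisecond precision, so the range comparison works,
# unlike to_date, which drops the time component entirely.
result = (
    df.withColumn("date", to_timestamp("date"))
      .filter("date between to_timestamp('2020-06-22T19:17:16.400') "
              "and to_timestamp('2020-06-22T19:17:16.451')")
)
result.show(truncate=False)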
