
How to use milliseconds as an argument in PySpark Window().rangeBetween()?

I want to use the rangeBetween method of Window() in PySpark with milliseconds as the boundary.

I'm trying to do this:

from pyspark.sql import Window
from pyspark.sql import functions as F

# Cast the timestamp column to epoch seconds (as a double)
df = df.withColumn("timestamp_ms", F.col("Dates").cast("double"))

# Try a range covering the 0.1 seconds before each row
w = Window().orderBy("timestamp_ms").rangeBetween(-0.1, 0.0)

However, I get an error because of the float argument (-0.1) passed to rangeBetween:

py4j.protocol.Py4JError: An error occurred while calling o73.rangeBetween. Trace:
py4j.Py4JException: Method rangeBetween([class java.lang.Double, class java.lang.Double]) does not exist
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
        at py4j.Gateway.invoke(Gateway.java:274)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:238)
        at java.lang.Thread.run(Thread.java:748)

Is there any alternative way to create these windows between (-0.1 seconds, 0)?

*timestamp_ms is a column of timestamps with millisecond resolution.

*Dates is a column of dates in this format: "2019-07-26 08:56:07.171".

Thanks!

I solved this issue as follows:

# Cast the timestamp column to epoch seconds (as a double)
df = df.withColumn("timestamp_ms", F.col("Dates").cast("double"))

# Convert epoch seconds to epoch milliseconds
df = df.withColumn("timestamp_ms", F.col("timestamp_ms") * 1000)

# 100 ms = 0.1 s, expressed as integer boundaries
w = Window().orderBy("timestamp_ms").rangeBetween(-100, 0)

I noticed that multiplying the "timestamp_ms" column by 1000 turns the epoch-seconds values into epoch milliseconds, so the window boundaries can be written as whole numbers. rangeBetween(-100, 0) then covers the (-0.1 s, 0] range I wanted, and the integer arguments match the long signature that rangeBetween expects.
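As a sanity check on the conversion, here is a plain-Python sketch (not Spark code; `to_epoch_ms` is a hypothetical helper, and the format string is assumed to match the "Dates" example above) showing that the cast-to-seconds-then-multiply-by-1000 trick yields integer milliseconds, so a 0.1-second gap becomes exactly 100 units, the lower bound used in rangeBetween(-100, 0):

```python
from datetime import datetime

# Hypothetical helper mirroring cast("double") followed by * 1000 in Spark.
def to_epoch_ms(ts: str) -> int:
    """Parse a timestamp string and return integer epoch milliseconds."""
    # Like cast("double"): epoch seconds with a fractional part
    seconds = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f").timestamp()
    # Like * 1000: whole milliseconds (rounded to absorb float noise)
    return int(round(seconds * 1000))

a = to_epoch_ms("2019-07-26 08:56:07.071")
b = to_epoch_ms("2019-07-26 08:56:07.171")
print(b - a)  # 100 -> rows 0.1 s apart differ by exactly 100 window units
```

With the ordering column in these units, integer boundaries express sub-second ranges, which is what works around rangeBetween rejecting doubles.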
