Date Difference within Same Column Apache Spark

I have one column and I need to find the difference in days between each row and the previous one, partitioned by id. This has to be done using Spark SQL. I have written the code below, but somehow the answer comes out wrong. Kindly let me know where I am going wrong.

WindowSpec window = Window.partitionBy("id").orderBy("date_time");
Dataset<Row> resultSet = testData.withColumn("day_diff", functions.datediff(col("date_time"), functions.lag(col("date_time"), 1).over(window)));

[Dataset image]

You should probably do it step by step.

  • Step 1: Use a window function to collect the date from the previous row.
  • Step 2: Use datediff to find the difference.
  • Step 3: Drop the extra column.
testData
    .withColumn("prev_date", functions.lag(col("date_time"), 1).over(window))
    .withColumn("day_diff", functions.datediff(col("date_time"), col("prev_date")))
    .drop(col("prev_date"))
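For completeness, here is a minimal, self-contained sketch of the same approach, assuming testData has an integer id column and a date_time column holding yyyy-MM-dd date strings (the class name and sample rows are hypothetical):

import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;
import org.apache.spark.sql.expressions.Window;
import org.apache.spark.sql.expressions.WindowSpec;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

import static org.apache.spark.sql.functions.col;

public class DayDiffExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("DayDiffExample")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical sample rows standing in for testData.
        StructType schema = new StructType()
                .add("id", DataTypes.IntegerType)
                .add("date_time", DataTypes.StringType);
        List<Row> rows = Arrays.asList(
                RowFactory.create(1, "2020-01-01"),
                RowFactory.create(1, "2020-01-04"),
                RowFactory.create(1, "2020-01-09"),
                RowFactory.create(2, "2020-02-01"),
                RowFactory.create(2, "2020-02-15"));
        Dataset<Row> testData = spark.createDataFrame(rows, schema);

        // Rows that share an id, ordered by date_time.
        WindowSpec window = Window.partitionBy("id").orderBy("date_time");

        Dataset<Row> resultSet = testData
                // Step 1: pull the previous row's date into the current row.
                .withColumn("prev_date", functions.lag(col("date_time"), 1).over(window))
                // Step 2: days between the previous and the current date.
                .withColumn("day_diff", functions.datediff(col("date_time"), col("prev_date")))
                // Step 3: drop the helper column.
                .drop("prev_date");

        resultSet.show();
        spark.stop();
    }
}

Note that the first row of each partition has no predecessor, so lag returns null there and day_diff is null for that row. Strings in yyyy-MM-dd format work with datediff because Spark casts them to dates implicitly.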
