Apache Spark: groupby not working as expected
I have a DataFrame `df`:

Date        Col1  Col2
2010-01-01  23    28
2012-09-01  50    70
2010-03-04  80    10
2012-04-01  19    20
2012-03-05  67    9
from pyspark.sql.functions import year
df_new = df.withColumn('year', year(df['Date']))
Date        Col1  Col2  year
2010-01-01  23    28    2010
2012-09-01  50    70    2012
...and so on.
Now, I am trying to find the maximum of Col1 and Col2 for each year, so I use groupby:
df_new.groupby('year').max().show()
The result I get is not what I expected. Result obtained:
year  max(year)
2010  2010
2012  2012
...and so on.
Expected result:
year max(Col1) max(Col2)
2010 80 28
2012 67 70
Check the code below.
from pyspark.sql import functions as F
df.withColumn('year', F.year(df['Date'])) \
  .groupBy("year") \
  .agg(F.max("Col1").alias("max_col1"),
       F.max("Col2").alias("max_col2")) \
  .show()
You should perform multiple max aggregations with agg on Col1 and Col2:
from pyspark.sql import functions as F
df_new.groupBy(F.year("Date")).agg(F.max("Col1"), F.max("Col2")).show()
In case you have a huge dataset, it is better to use a Window function, as below; this performs way better than groupBy:
from pyspark.sql import functions as F
from pyspark.sql.window import Window as W
df = spark.table("test_poc")
df = df.withColumn("Year", F.year(F.col('date')))
_w = W.partitionBy(F.col('Year'))
df = df.withColumn('max_col', F.max('id').over(_w)).withColumn('min_col', F.min('id').over(_w))
df.show()
--------- OUTPUT ---------
+---+-------------------+----+-------+-------+
| id| date|Year|max_col|min_col|
+---+-------------------+----+-------+-------+
| 5|2019-12-31 23:26:59|2019| 5| 2|
| 2|2019-12-31 23:26:59|2019| 5| 2|
| 1|1969-12-31 23:26:59|1969| 3| 1|
| 2|1969-12-31 23:26:30|1969| 3| 1|
| 3|1969-12-31 23:26:26|1969| 3| 1|
| 4|2020-12-31 23:26:59|2020| 4| 1|
| 1|2020-12-31 23:26:59|2020| 4| 1|
+---+-------------------+----+-------+-------+