Scala Spark creating a new column in the dataframe based on the aggregate count of values in another column

I have a Spark dataframe like the one below:

+-----+----------+----------+
|   ID|      date|  count   |
+-----+----------+----------+
|54500|2016-05-02|         0|
|54500|2016-05-09|         0|
|54500|2016-05-16|         0|
|54500|2016-05-23|         0|
|54500|2016-06-06|         0|
|54500|2016-06-13|         0|
|54441|2016-06-20|         0|
|54441|2016-06-27|         0|
|54441|2016-07-04|         0|
|54441|2016-07-11|         0|
+-----+----------+----------+

I want to add an extra column that contains the count of records for each particular ID in the dataframe, while avoiding a for loop. The target dataframe looks like this:

+-----+----------+----------+
|   ID|      date|  count   |
+-----+----------+----------+
|54500|2016-05-02|         6|
|54500|2016-05-09|         6|
|54500|2016-05-16|         6|
|54500|2016-05-23|         6|
|54500|2016-06-06|         6|
|54500|2016-06-13|         6|
|54441|2016-06-20|         4|
|54441|2016-06-27|         4|
|54441|2016-07-04|         4|
|54441|2016-07-11|         4|
+-----+----------+----------+

I tried this:

import org.apache.spark.sql.expressions.Window

var s = Window.partitionBy("ID")
var df2 = df.withColumn("count", count.over(s))

This gives the error:

error: ambiguous reference to overloaded definition,
both method count in object functions of type (columnName: String)org.apache.spark.sql.TypedColumn[Any,Long]
and  method count in object functions of type (e: org.apache.spark.sql.Column)org.apache.spark.sql.Column
match expected type ?
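
For reference, the error occurs because count in org.apache.spark.sql.functions is overloaded for both a column name (String) and a Column, so a bare count cannot be resolved. A minimal sketch of one way to disambiguate, assuming the same df and window as above (count(lit(1)) is an illustrative choice, not part of the original post):

import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{count, lit}

// Partition by ID so the count is computed per ID
val s = Window.partitionBy("ID")

// Passing an explicit Column (lit(1)) selects the Column overload of count
val df2 = df.withColumn("count", count(lit(1)).over(s))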

Please follow the approach below:

 import spark.implicits._

val df1 = List(54500, 54500, 54500, 54500, 54500, 54500, 54441, 54441, 54441, 54441).toDF("ID")
val df2 = df1.groupBy("ID").count()
df1.join(df2, Seq("ID"), "left").show(false)

+-----+-----+
|ID   |count|
+-----+-----+
|54500|6    |
|54500|6    |
|54500|6    |
|54500|6    |
|54500|6    |
|54500|6    |
|54441|4    |
|54441|4    |
|54441|4    |
|54441|4    |
+-----+-----+
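
To reproduce the target dataframe from the question (keeping the date column), the same groupBy-and-join idea can be applied to the original df; a minimal sketch, assuming df holds the ID, date and count columns shown above (counts and result are illustrative names, not from the original answer):

// Per-ID row counts: yields columns ID and count
val counts = df.groupBy("ID").count()

// Drop the original all-zero count column, then attach the per-ID count to every row
val result = df.drop("count").join(counts, Seq("ID"), "left")
result.show(false)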

