How can I concatenate the rows in a pyspark dataframe with multiple columns using groupby and aggregate

I have a pyspark dataframe with multiple columns, for example the one below.

from pyspark.sql import Row

# sc and sqlContext are assumed to be the pre-existing SparkContext / SQLContext
# (for example, the ones created by the pyspark shell).
l = [('Jack',"a","p"),('Jack',"b","q"),('Bell',"c","r"),('Bell',"d","s")]
rdd = sc.parallelize(l)
score_rdd = rdd.map(lambda x: Row(name=x[0], letters1=x[1], letters2=x[2]))
score_card = sqlContext.createDataFrame(score_rdd)

+----+--------+--------+
|name|letters1|letters2|
+----+--------+--------+
|Jack|       a|       p|
|Jack|       b|       q|
|Bell|       c|       r|
|Bell|       d|       s|
+----+--------+--------+

Now I want to group by "name" and concatenate the values in every row for both columns. I know how to do it, but if there are thousands of columns my code becomes very ugly. Here is my solution.

import pyspark.sql.functions as f

t = score_card.groupby("name").agg(
    f.concat_ws("", f.collect_list("letters1")).alias("letters1"),
    f.concat_ws("", f.collect_list("letters2")).alias("letters2")
)

Here is the output I get when I save it in a CSV file.

+----+--------+--------+
|name|letters1|letters2|
+----+--------+--------+
|Jack|      ab|      pq|
|Bell|      cd|      rs|
+----+--------+--------+

But my main concern is about these two lines of code:

f.concat_ws("", f.collect_list("letters1")).alias("letters1"),
f.concat_ws("", f.collect_list("letters2")).alias("letters2")

If there are thousands of columns then I will have to repeat the above code thousands of times. Is there a simpler solution for this so that I don't have to repeat f.concat_ws() for every column?

I have searched everywhere and haven't been able to find a solution.

Yes, you can build the aggregation expressions in a loop (a list comprehension) inside the agg function and iterate over df.columns. Let me know if it helps.

    from pyspark.sql import functions as F
    df.show()

    # +--------+--------+----+
    # |letters1|letters2|name|
    # +--------+--------+----+
    # |       a|       p|Jack|
    # |       b|       q|Jack|
    # |       c|       r|Bell|
    # |       d|       s|Bell|
    # +--------+--------+----+

    df.groupBy("name").agg(
        *[
            F.array_join(F.collect_list(column), "").alias(column)
            for column in df.columns
            if column != 'name'
        ]
    ).show()

    # +----+--------+--------+
    # |name|letters1|letters2|
    # +----+--------+--------+
    # |Bell|      cd|      rs|
    # |Jack|      ab|      pq|
    # +----+--------+--------+
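
If you are on a Spark version older than 2.4 (where array_join was introduced), the same comprehension should also work with the concat_ws pattern already used in the question; a minimal sketch, assuming the same df as above:

    from pyspark.sql import functions as F

    # Build one aggregation expression per non-grouping column, then unpack them into agg().
    # concat_ws("") joins the collected values with an empty separator, like array_join above.
    agg_exprs = [
        F.concat_ws("", F.collect_list(c)).alias(c)
        for c in df.columns
        if c != "name"
    ]
    df.groupBy("name").agg(*agg_exprs).show()

Note that collect_list does not guarantee the order of the collected values after a shuffle, so if the concatenation order matters you may need an explicit ordering column.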
