How do we add a column total in Spark using Scala?
I have data like this:
+-------------+--------+--------+
| COl1| Col2| COL3|
+-------------+--------+--------+
|A .......... | 56102| 45991|
|B........... | 25336| 23099|
+-------------+--------+--------+
It should look like this:
+-------------+--------+--------+
| COl1| Col2| COL3|
+-------------+--------+--------+
|A .......... | 56102| 45991|
|B........... | 25336| 23099|
|Total....... | 81438| 69090|
+-------------+--------+--------+
I need a row labeled Total whose values are the sums of the remaining rows in the DataFrame.
You can use aggregation functions to compute the sums, and a union to append the result to the end of the original df. For this to work, you just need to make sure that the column names coincide.
It would go like this:
import org.apache.spark.sql.functions.{lit, sum}
import spark.implicits._ // for .toDF and the 'COL2 symbol syntax

val df = Seq(("A", 56102, 45991), ("B", 25336, 23099))
  .toDF("COL1", "COL2", "COL3")

// One-row DataFrame: a "Total" label plus the sum of each numeric column
val sums = df.select(lit("Total") as "COL1", sum('COL2) as "COL2", sum('COL3) as "COL3")

df.union(sums).show()
+-----+-----+-----+
| COL1| COL2| COL3|
+-----+-----+-----+
| A|56102|45991|
| B|25336|23099|
|Total|81438|69090|
+-----+-----+-----+
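As a quick sanity check on the totals (no Spark needed), the same column sums can be computed over the example rows in plain Scala; the tuples below mirror the `Seq` used to build the DataFrame above:

```scala
// Rows from the example DataFrame: (COL1, COL2, COL3)
val rows = Seq(("A", 56102, 45991), ("B", 25336, 23099))

// Sum each numeric column, mirroring sum('COL2) and sum('COL3)
val col2Total = rows.map(_._2).sum // 56102 + 25336 = 81438
val col3Total = rows.map(_._3).sum // 45991 + 23099 = 69090

println(s"Total $col2Total $col3Total") // prints: Total 81438 69090
```

This confirms the 81438 and 69090 values in the `show()` output above.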