Spark SQL - Sum multiple columns in single query JAVA
I have over 50 columns for which I want to calculate the sum using Spark SQL. I don't want to write out each column name manually. How can I do it programmatically?
Something like:
// Fold all columns of df into one Column expression that adds them together
val addNums = df.columns.map(c => df(c)).reduce(_ + _)
val sumDF = df.select(addNums.as("SumOfFifty"))
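Since the title asks for Java, here is a minimal sketch of the same column-folding idea with the Java Dataset API. The class name SumAllColumns and the three-column sample data are placeholders standing in for the real 50-column DataFrame, the alias SumOfFifty is carried over from the attempt above, and every column is assumed to be numeric.

import java.util.Arrays;
import java.util.List;
import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class SumAllColumns {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("SumAllColumns").master("local[*]").getOrCreate();

        // Small stand-in for the real 50-column, all-numeric DataFrame.
        StructType schema = new StructType()
                .add("a", DataTypes.IntegerType)
                .add("b", DataTypes.IntegerType)
                .add("c", DataTypes.IntegerType);
        List<Row> rows = Arrays.asList(RowFactory.create(1, 2, 3), RowFactory.create(4, 5, 6));
        Dataset<Row> df = spark.createDataFrame(rows, schema);

        // Build one Column expression that adds every column, without naming them by hand.
        Column addNums = Arrays.stream(df.columns())   // String[] of column names
                .map(functions::col)                   // each name -> Column
                .reduce(Column::plus)                  // add the columns together
                .get();                                // safe: df has at least one column

        Dataset<Row> sumDF = df.select(addNums.as("SumOfFifty"));
        sumDF.show();

        spark.stop();
    }
}

The stream-and-reduce over Column::plus plays the same role as reduce(_ + _) in the Scala attempt: it builds a single expression like a + b + c + ... that is evaluated in one select, rather than one pass per column.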