
Scala add new column to dataframe by expression

I want to add a new column to a DataFrame using an expression. For example, I have this DataFrame:

+-----+----------+----------+-----+
| C1  | C2       |   C3     |C4   |
+-----+----------+----------+-----+
|steak|1         |1         |  150|
|steak|2         |2         |  180|
| fish|3         |3         |  100|
+-----+----------+----------+-----+

and I want to create a new column C5 with the expression "C2/C3+C4". Assume there are several new columns to add, and the expressions may differ and come from a database.

Is there a good way to do this?

I know that if I have an expression like "2+3*4", I can evaluate it with scala.tools.reflect.ToolBox.

Normally I use df.withColumn to add a new column.

It seems I need to create a UDF, but how can I pass the column values as parameters to the UDF? In particular, different expressions may need different columns in the calculation.
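For reference, if you do go the UDF route, column values are passed simply by listing the columns as arguments when the UDF is applied. A minimal sketch (the SparkSession setup and the UDF name `compute` are illustrative, not from the original post):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

val spark = SparkSession.builder()
  .master("local[*]").appName("udf-demo").getOrCreate()
import spark.implicits._

val df = Seq(
  ("steak", 1, 1, 150),
  ("steak", 2, 2, 180),
  ("fish", 3, 3, 100)
).toDF("C1", "C2", "C3", "C4")

// A UDF takes plain Scala values; Spark extracts them from the columns
// you pass at the call site. Here it computes C2/C3 + C4.
val compute = udf((c2: Int, c3: Int, c4: Int) => c2.toDouble / c3 + c4)

val withUdf = df.withColumn("C5", compute(col("C2"), col("C3"), col("C4")))
withUdf.show()
```

The downside of a UDF here is that the expression is hard-coded in Scala; for expression strings loaded at runtime, `expr` (as in the answers below) is the better fit.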

This can be done using expr to create a Column from an expression:

val df = Seq((1,2)).toDF("x","y")

val myExpression = "x+y"

import org.apache.spark.sql.functions.expr

df.withColumn("z",expr(myExpression)).show()

+---+---+---+
|  x|  y|  z|
+---+---+---+
|  1|  2|  3|
+---+---+---+
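Since the question mentions several expressions coming from a database, the same `expr` approach scales by folding over (columnName, expressionString) pairs. A sketch under that assumption (the `newColumns` list stands in for whatever your database query returns):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, expr}

val spark = SparkSession.builder()
  .master("local[*]").appName("expr-demo").getOrCreate()
import spark.implicits._

val df = Seq(
  ("steak", 1, 1, 150),
  ("steak", 2, 2, 180),
  ("fish", 3, 3, 100)
).toDF("C1", "C2", "C3", "C4")

// Hypothetical: expression strings as they might arrive from a database.
val newColumns = Seq(
  "C5" -> "C2/C3 + C4",
  "C6" -> "C2 * C4"
)

// Fold over the pairs, adding one column per expression.
val result = newColumns.foldLeft(df) {
  case (acc, (name, expression)) => acc.withColumn(name, expr(expression))
}
result.show()
```

Each expression string is parsed by Spark SQL itself, so no ToolBox evaluation or UDF is needed.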

Two approaches:

    import spark.implicits._ //so that you could use .toDF
    val df = Seq(
      ("steak", 1, 1, 150),
      ("steak", 2, 2, 180),
      ("fish", 3, 3, 100)
    ).toDF("C1", "C2", "C3", "C4")

    import org.apache.spark.sql.functions._

    // 1st approach using expr
    df.withColumn("C5", expr("C2/(C3 + C4)")).show()

    // 2nd approach using selectExpr
    df.selectExpr("*", "(C2/(C3 + C4)) as C5").show()

+-----+---+---+---+--------------------+
|   C1| C2| C3| C4|                  C5|
+-----+---+---+---+--------------------+
|steak|  1|  1|150|0.006622516556291391|
|steak|  2|  2|180| 0.01098901098901099|
| fish|  3|  3|100| 0.02912621359223301|
+-----+---+---+---+--------------------+

In Spark 2.x, you can create a new column C5 with the expression "C2/C3+C4" using withColumn() and org.apache.spark.sql.functions._:

 val currentDf = Seq(
              ("steak", 1, 1, 150),
              ("steak", 2, 2, 180),
              ("fish", 3, 3, 100)
            ).toDF("C1", "C2", "C3", "C4")

 val requiredDf = currentDf
                   .withColumn("C5", (col("C2")/col("C3")+col("C4")))

You can also do the same with org.apache.spark.sql.Column directly (though this approach uses slightly more memory than org.apache.spark.sql.functions._, since it constructs the Column objects explicitly):

 val requiredDf = currentDf
                   .withColumn("C5", (new Column("C2")/new Column("C3")+new Column("C4")))

This worked perfectly for me. I am using Spark 2.0.2.
