
How to change the value of a column according to the value of another column in a Spark DataFrame

I am starting with this DataFrame:

DF1
+-----+-----+------+------+
|name |type |item1 |item2 |
+-----+-----+------+------+
|apple|fruit|apple1|apple2|
|beans|vege |beans1|beans2|
|beef |meat |beef1 |beef2 |
|kiwi |fruit|kiwi1 |kiwi2 |
|pork |meat |pork1 |pork2 |
+-----+-----+------+------+

Now I want to populate a column called "prop" based on the value of the "type" column, as in DF2. For example:

If "type"== "fruit" then "prop"="item1"
If "type"== "vege" then "prop"="item1"
If "type"== "meat" then "prop"="item2"

What is the best way to do this? I was thinking of filtering on each "type", populating the "prop" column, and then concatenating the resulting DataFrames (sketched after DF2 below), but that doesn't seem very efficient.

DF2
+-----+-----+------+------+------+
|name |type |item1 |item2 |prop  |
+-----+-----+------+------+------+
|apple|fruit|apple1|apple2|apple1|
|beans|vege |beans1|beans2|beans1|
|beef |meat |beef1 |beef2 |beef2 |
|kiwi |fruit|kiwi1 |kiwi2 |kiwi1 |
|pork |meat |pork1 |pork2 |pork2 |
+-----+-----+------+------+------+
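For reference, the filter-and-concatenate approach I had in mind would look roughly like this (just a sketch, assuming the DataFrame above is named df1 and spark.implicits._ is in scope); it filters the data once per type and then unions the pieces back together:

//sketch of the per-type filter + union approach described above
val fruits = df1.filter($"type" === "fruit").withColumn("prop", $"item1")
val veges  = df1.filter($"type" === "vege").withColumn("prop", $"item1")
val meats  = df1.filter($"type" === "meat").withColumn("prop", $"item2")

val df2 = fruits.union(veges).union(meats)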

Use when + otherwise expressions for this case; they compute the new column in a single pass over the DataFrame and are very efficient in Spark.

//sample data
df.show()
//+-----+-----+------+------+
//| name| type| item1| item2|
//+-----+-----+------+------+
//|apple|fruit|apple1|apple2|
//|beans| vege|beans1|beans2|
//| beef| meat| beef1| beef2|
//| kiwi|fruit| kiwi1| kiwi2|
//| pork| meat| pork1| pork2|
//+-----+-----+------+------+

//using the isin function
df.withColumn("prop",
    when(col("type").isin(Seq("vege", "fruit"): _*), col("item1"))
      .when(col("type") === "meat", col("item2"))
      .otherwise(col("type")))
  .show()

//equivalent, with explicit comparisons
df.withColumn("prop",
    when(col("type") === "fruit" || col("type") === "vege", col("item1"))
      .when(col("type") === "meat", col("item2"))
      .otherwise(col("type")))
  .show()
//+-----+-----+------+------+------+
//| name| type| item1| item2|  prop|
//+-----+-----+------+------+------+
//|apple|fruit|apple1|apple2|apple1|
//|beans| vege|beans1|beans2|beans1|
//| beef| meat| beef1| beef2| beef2|
//| kiwi|fruit| kiwi1| kiwi2| kiwi1|
//| pork| meat| pork1| pork2| pork2|
//+-----+-----+------+------+------+
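If the type-to-item mapping grows, the same when/otherwise chain can also be built from a plain Scala Map rather than written out by hand. This is only a sketch (the propSource name is made up, and it assumes org.apache.spark.sql.functions._ is imported and df is the sample DataFrame above):

//build the when/otherwise chain from a Map, so adding a new type only
//means adding one entry to the Map
val propSource = Map("fruit" -> "item1", "vege" -> "item1", "meat" -> "item2")

val propCol = propSource.foldLeft(lit(null).cast("string")) {
  case (acc, (typeValue, itemCol)) =>
    when(col("type") === typeValue, col(itemCol)).otherwise(acc)
}

df.withColumn("prop", propCol).show()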

It can also be done by nesting when inside otherwise, as below:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object WhenThen {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("WhenThen").master("local[*]").getOrCreate()

    import spark.implicits._
    val df = List(("apple", "fruit", "apple1", "apple2"),
      ("beans", "vege", "beans1", "beans2"),
      ("beef", "meat", "beef1", "beef2"),
      ("kiwi", "fruit", "kiwi1", "kiwi2"),
      ("pork", "meat", "pork1", "pork2")
    ).toDF("name", "type", "item1", "item2")

    //fruit and vege take item1, meat takes item2, anything else gets ""
    df.withColumn("prop",
      when($"type" === "fruit", $"item1").otherwise(
        when($"type" === "vege", $"item1").otherwise(
          when($"type" === "meat", $"item2").otherwise("")
        )
      )).show()
  }

}
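The same logic reads a little flatter if the when calls are chained directly instead of being nested inside otherwise; this sketch is equivalent and drops into the same main method above:

//chained form: the first matching condition wins, otherwise empty string
df.withColumn("prop",
  when($"type" === "fruit", $"item1")
    .when($"type" === "vege", $"item1")
    .when($"type" === "meat", $"item2")
    .otherwise("")
).show()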
