
Spark dataframe not adding columns with null values

I am trying to create a new column by adding two existing columns in my dataframe.

Original dataframe

╔══════╦══════╗
║ cola ║ colb ║
╠══════╬══════╣
║ 1    ║ 1    ║
║ null ║ 3    ║
║ 2    ║ null ║
║ 4    ║ 2    ║
╚══════╩══════╝

Expected output with derived column

╔══════╦══════╦══════╗
║ cola ║ colb ║ colc ║
╠══════╬══════╬══════╣
║ 1    ║ 1    ║    2 ║
║ null ║ 3    ║    3 ║
║ 2    ║ null ║    2 ║
║ 4    ║ 2    ║    6 ║
╚══════╩══════╩══════╝

When I use df = df.withColumn('colc', df.cola + df.colb), colc comes out null for every row where either input column is null.

The output I get is:

╔══════╦══════╦══════╗
║ cola ║ colb ║ colc ║
╠══════╬══════╬══════╣
║ 1    ║ 1    ║ 2    ║
║ null ║ 3    ║ null ║
║ 2    ║ null ║ null ║
║ 4    ║ 2    ║ 6    ║
╚══════╩══════╩══════╝

Is there any way to incorporate the null values into the calculation? Any help would be appreciated.

Replace null with 0 using the coalesce function, then add the two columns together. With selectExpr and SQL syntax:

df.selectExpr('*', 'coalesce(cola, 0) + coalesce(colb, 0) as colc')

You can coalesce to 0 to get a sum, but note that a row where both columns are null would then produce 0 rather than null. If you want such rows to stay null, you can make use of conditional functions.

For your case, the code should look something like

df.selectExpr('*', 'if(isnull(cola) and isnull(colb), null, coalesce(cola, 0) + coalesce(colb, 0)) as colc')
