Including null inside PySpark isin
This is my dataframe:
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
spark = SparkSession.builder.getOrCreate()
dCols = ['c1', 'c2']
dData = [('a', 'b'),
         ('c', 'd'),
         ('e', None)]
df = spark.createDataFrame(dData, dCols)
Is there a syntax to include null inside .isin()? Something like
df = df.withColumn(
    'newCol',
    F.when(F.col('c2').isin({'d', None}), 'true')  # <=====?
    .otherwise('false')
)
df.show()
After executing the code I get
+---+----+------+
| c1| c2|newCol|
+---+----+------+
| a| b| false|
| c| d| true|
| e|null| false|
+---+----+------+
instead of
+---+----+------+
| c1| c2|newCol|
+---+----+------+
| a| b| false|
| c| d| true|
| e|null| true|
+---+----+------+
I would like to find a solution where I would not need to reference the same column twice, as we have to do now:
(F.col('c2') == 'd') | F.col('c2').isNull()
NULL is not a value but represents the absence of a value, so you can't compare it to None or NULL. The comparison always evaluates to null, which when() treats the same as false. You need to use isNull to check:
df = df.withColumn(
    'newCol',
    F.when(F.col('c2').isin({'d'}) | F.col('c2').isNull(), 'true')
    .otherwise('false')
)
df.show()
#+---+----+------+
#| c1| c2|newCol|
#+---+----+------+
#| a| b| false|
#| c| d| true|
#| e|null| true|
#+---+----+------+
One reference to the column is not enough in this case. To check for nulls you need to use a separate isNull method.
Also, if you want a column of true/false, you can cast the result to boolean directly without using when:
import pyspark.sql.functions as F
df2 = df.withColumn(
    'newCol',
    (F.col('c2').isin(['d']) | F.col('c2').isNull()).cast('boolean')
)
df2.show()
+---+----+------+
| c1| c2|newCol|
+---+----+------+
| a| b| false|
| c| d| true|
| e|null| true|
+---+----+------+
Try this: use the 'or' operation to test for nulls
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

dCols = ['c1', 'c2']
dData = [('a', 'b'),
         ('c', 'd'),
         ('e', None)]
df = spark.createDataFrame(dData, dCols)

df = df.withColumn(
    'newCol',
    F.when(F.col('c2').isNull() | (F.col('c2') == 'd'), 'true')
    .otherwise('false')
)
df.show()