
How to use a when and otherwise statement on boolean columns of a Spark dataframe?

I have a dataset with three columns: country (String), threshold_1 (bool), and threshold_2 (bool).

I am trying to create a new column with the logic below, but I am getting an error.

I am using a Palantir code workbook for this. Can anyone tell me what I am missing here?

df = df.withColumn("Threshold_Filter", 
        when(df["country"]=="INDIA" & df["threshold_1"]==True | df["threshold_2 "]==True, "Ind_country"
     ).otherwise("Dif_country"))

You just need to put each comparison in parentheses. In Python, the bitwise operators & and | bind more tightly than ==, so without parentheses the condition is parsed in a way Spark cannot evaluate.
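To see why the original version fails, you can inspect how Python parses the un-parenthesized condition (this uses the standard-library ast module purely as an illustration; no Spark is involved):

```python
import ast

# Parse the un-parenthesized condition from the question.
tree = ast.parse('df["country"] == "INDIA" & df["threshold_1"]', mode="eval")

# The & is evaluated FIRST: Python reads this as
#   df["country"] == ("INDIA" & df["threshold_1"])
# i.e. the right-hand side of == is a BinOp with BitAnd,
# which is not what was intended.
print(ast.dump(tree.body, indent=2))
```

Because the string "INDIA" gets bitwise-ANDed with a Column before the equality check, PySpark raises an error instead of building the comparison you meant.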

from pyspark.sql.functions import when

df = (
    df
    .withColumn(
        "Threshold_Filter",
        when(
            (df["country"]=="INDIA") &
            (df["threshold_1"]==True) |
            (df["threshold_2"]==True),
            "Ind_country")
        .otherwise("Dif_country"))
)
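One more thing to double-check: & also binds more tightly than |, so the condition above groups as (country AND threshold_1) OR threshold_2. If the intent is "India AND at least one threshold", an extra pair of parentheses is needed around the | part. A quick sanity check with plain Python booleans (standing in for the column values) shows the difference:

```python
# Stand-in values for one row: a non-India country with both thresholds set.
country_is_india = False
threshold_1 = True
threshold_2 = True

# & binds tighter than |, so the answer's condition groups like this:
as_written = (country_is_india & threshold_1) | threshold_2

# If the intent is "India AND (threshold_1 OR threshold_2)":
intended = country_is_india & (threshold_1 | threshold_2)

print(as_written)  # True  -- row matches even though country is not INDIA
print(intended)    # False
```

Which grouping is correct depends on the business logic; the original question does not say, so this is only a caution, not a correction.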

