Update column with a where clause in PySpark
How do I update a column in a PySpark dataframe using a where clause?
This is similar to the SQL operation:
UPDATE table1 SET alpha1 = x WHERE alpha2 < 6;
where alpha1 and alpha2 are columns of table1.
For example, I have a dataframe table1 with the following values:
table1:
alpha1 alpha2
     3      7
     4      5
     5      4
     6      8

dataframe table1 after the update:
alpha1 alpha2
     3      7
     x      5
     x      4
     6      8
How can I do this with a PySpark dataframe?
You are looking for the when function:
from pyspark.sql import functions as F

df = spark.createDataFrame([("3", 7), ("4", 5), ("5", 4), ("6", 8)], ["alpha1", "alpha2"])
df.show()
>>> +------+------+
>>> |alpha1|alpha2|
>>> +------+------+
>>> |     3|     7|
>>> |     4|     5|
>>> |     5|     4|
>>> |     6|     8|
>>> +------+------+
df2 = df.withColumn("alpha1", F.when(df["alpha2"] < 6, "x").otherwise(df["alpha1"]))
df2.show()
>>> +------+------+
>>> |alpha1|alpha2|
>>> +------+------+
>>> |     3|     7|
>>> |     x|     5|
>>> |     x|     4|
>>> |     6|     8|
>>> +------+------+