How to use a column value as key to a dictionary in PySpark?
I have a small PySpark DataFrame df:
index col1
0 1
1 3
2 4
And a dictionary:
LOOKUP = {0: 2, 1: 5, 2: 5, 3: 4, 4: 6}
I now want to add an extra column col2 to df, equal to the LOOKUP values of col1.
My output should look like this:
index col1 col2
0 1 5
1 3 4
2 4 6
I tried using:
df = df.withColumn(col("col2"), LOOKUP[col("col1")])
But this gave me errors, as did using expr.
How to achieve this in PySpark?
You can use a map column that you create from the lookup dictionary:
from itertools import chain
from pyspark.sql import functions as F

lookup = {0: 2, 1: 5, 2: 5, 3: 4, 4: 6}
# Flatten the dict into alternating key/value literals and build a map column
lookup_map = F.create_map(*[F.lit(x) for x in chain(*lookup.items())])
# Index the map column by col1; keys absent from the map yield null
df1 = df.withColumn("col2", lookup_map[F.col("col1")])
df1.show()
#+-----+----+----+
#|index|col1|col2|
#+-----+----+----+
#| 0| 1| 5|
#| 1| 3| 4|
#| 2| 4| 6|
#+-----+----+----+
Another way would be to create a lookup_df from the dict and then join it with your dataframe, as sketched below.
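A minimal sketch of that join approach, assuming an active SparkSession named spark and the LOOKUP dictionary from the question (the name lookup_df and the broadcast hint are illustrative choices, not part of the original answer):

from pyspark.sql import functions as F

# Two-column DataFrame built from the dict items: (key, value) -> (col1, col2)
lookup_df = spark.createDataFrame(list(LOOKUP.items()), ["col1", "col2"])

# A left join keeps rows whose col1 has no entry in LOOKUP (col2 becomes null);
# broadcasting is a reasonable hint here since the lookup table is tiny
df1 = df.join(F.broadcast(lookup_df), on="col1", how="left")
df1.select("index", "col1", "col2").show()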
You can use a CASE WHEN statement built with Python f-strings from the LOOKUP dictionary:
from pyspark.sql import functions as F

column = "col1"  # column whose values are looked up
# Build one WHEN clause per dict entry; unmatched values fall through to NULL.
# Note: the quoted THEN values make col2 a string column.
e = f"""CASE {' '.join([f"WHEN {column}='{k}' THEN '{v}'" for k, v in LOOKUP.items()])}
ELSE NULL END"""
out = df.withColumn("col2", F.expr(e))
out.show()
+-----+----+----+
|index|col1|col2|
+-----+----+----+
| 0| 1| 5|
| 1| 3| 4|
| 2| 4| 6|
+-----+----+----+