
Convert multiple columns in pyspark dataframe into one dictionary

I have created a PySpark DataFrame like this one:

df = spark.createDataFrame([
    ('v', 3, 'a'),
    ('d', 2, 'b'),
    ('q', 9, 'c')],
    ["c1", "c2", "c3"]
)

df.show()

+---+---+---+
| c1| c2| c3|
+---+---+---+
|  v|  3|  a|
|  d|  2|  b|
|  q|  9|  c|
+---+---+---+

I want to create a new column like this:

+--------------------------+
|           c4             |
+--------------------------+
|{"c1":"v","c2":3,"c3":"a"}|
|{"c1":"d","c2":2,"c3":"b"}|
|{"c1":"q","c2":9,"c3":"c"}|
+--------------------------+

I want c4 to be of type MapType, not StringType. Also, I want to keep the types of the values as they are (keep 3, 2 and 9 as integers, not strings).

Use struct + to_json like this if you want to get JSON strings:

import pyspark.sql.functions as F

df1 = df.select(
    F.to_json(
        F.struct(*[F.col(c) for c in df.columns])
    ).alias("c4")
)

df1.show(truncate=False)
#+--------------------------+
#|c4                        |
#+--------------------------+
#|{"c1":"v","c2":3,"c3":"a"}|
#|{"c1":"d","c2":2,"c3":"b"}|
#|{"c1":"q","c2":9,"c3":"c"}|
#+--------------------------+
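One advantage of to_json is that each value keeps its original JSON type inside the string (3 stays a bare number, not "3"). A minimal plain-Python sketch of what to_json(struct(...)) produces per row, using json.dumps as a stand-in rather than Spark itself:

```python
import json

rows = [("v", 3, "a"), ("d", 2, "b"), ("q", 9, "c")]
cols = ["c1", "c2", "c3"]

# Conceptually what to_json(struct(*cols)) emits for each row:
# zip column names with values, then serialize compactly.
c4 = [json.dumps(dict(zip(cols, r)), separators=(",", ":")) for r in rows]
print(c4[0])  # {"c1":"v","c2":3,"c3":"a"}
```

Note the integer 3 is serialized without quotes, matching the output shown above.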

EDIT

If you want a MapType column, use the create_map function:

from itertools import chain

df1 = df.select(
    F.create_map(
        # interleave each column name (key) with its value:
        # lit(c1), col(c1), lit(c2), col(c2), ...
        *chain.from_iterable([F.lit(c), F.col(c)] for c in df.columns)
    ).alias("c4")
)

df1.show(truncate=False)
#+---------------------------+
#|c4                         |
#+---------------------------+
#|{c1 -> v, c2 -> 3, c3 -> a}|
#|{c1 -> d, c2 -> 2, c3 -> b}|
#|{c1 -> q, c2 -> 9, c3 -> c}|
#+---------------------------+

Note that a map column has a single value type, so Spark will cast the integer values to strings to find a common type; if preserving the original types matters, the struct/to_json approach above is the safer option.
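create_map consumes its arguments as an alternating key1, value1, key2, value2, ... sequence, which is why each column contributes a (lit(name), col(name)) pair. The flattening that itertools.chain performs can be previewed in plain Python (the row dict here is illustrative, not Spark):

```python
from itertools import chain

columns = ["c1", "c2", "c3"]
row = {"c1": "v", "c2": 3, "c3": "a"}  # hypothetical row, for illustration

# create_map expects alternating arguments: key1, val1, key2, val2, ...
args = list(chain.from_iterable([c, row[c]] for c in columns))
print(args)  # ['c1', 'v', 'c2', 3, 'c3', 'a']
```

If the pairs were reversed to (col(name), lit(name)), the values would become the map keys, producing {v -> c1, 3 -> c2, a -> c3} instead.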
