Spark DataFrame extract value from array with where
I have a dataframe with the following schema:
root
|-- id: long (nullable = true)
|-- raw_data: struct (nullable = true)
| |-- address_components: array (nullable = true)
| | |-- element: struct (containsNull = true)
| | | |-- long_name: string (nullable = true)
| | | |-- short_name: string (nullable = true)
| | | |-- types: array (nullable = true)
| | | | |-- element: string (containsNull = true)
Example of address_components:
{
"address_components":[
{
"long_name":"Portugal",
"short_name":"PT",
"types":[
"country",
"political"
]
},
{
"long_name":"8200-591",
"short_name":"8200-591",
"types":[
"postal_code"
]
}
]
}
I want to create a new root-level attribute, Country: string, that should contain PT. However, the selection should be based on array_contains(col("types"), "country").
I figured part of it out like this:
df = (
    df.withColumn("country", expr("filter(raw_data.address_components, c -> array_contains(c.types, 'country'))"))
      .withColumn("country", col("country").getItem(0).getItem("long_name"))
)
Is there a smarter/shorter way to do this?
I fixed it using expressions in combination with withColumn:
df = df.withColumn("country", expr("filter(raw_data.address_components, c -> array_contains(c.types, 'country'))[0].short_name"))