How to parse JSON to get all values of a specific key within an array?
I'm having trouble trying to get a list of values from a specific key inside a JSON array using Python. Using the JSON example below, I am trying to create a list which consists of only the values of the "name" key.
Original JSON:
[
    {
        "id": 1,
        "name": "Bulbasaur",
        "type": [
            "grass",
            "poison"
        ]
    },
    {
        "id": 2,
        "name": "Ivysaur",
        "type": [
            "grass",
            "poison"
        ]
    }
]
Expected:
["Bulbasaur", "Ivysaur"]
Below is the code of my approach:
import json

try:
    with open("./simple.json", 'r') as f:
        contents = json.load(f)
except Exception as e:
    print(e)

print(contents[:]["name"])
I'm trying to find an approach where I don't need to loop over every single index and append the values one by one, something like the code above. Is this approach possible using Python's json library?
You cannot do contents[:]["name"] since contents is a list, not a dictionary: it has integer indexes, and you cannot access an element from it using the string "name".
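As a quick illustration (a minimal sketch, not part of the original answer, using sample data rather than the loaded file), you can see the failure directly:

# Slicing a list still gives a list, so string indexing raises a TypeError
contents = [{"name": "Bulbasaur"}, {"name": "Ivysaur"}]
print(type(contents[:]))      # <class 'list'>
print(contents[:]["name"])    # TypeError: list indices must be integers or slices, not str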
To fix that, you would want to iterate over the list and get the value for the "name" key of each item:
import json

contents = []
try:
    with open("./simple.json", 'r') as f:
        contents = json.load(f)
except Exception as e:
    print(e)

li = [item.get('name') for item in contents]
print(li)
The output will be
['Bulbasaur', 'Ivysaur']
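Note that item.get('name') returns None (or a supplied default) when an item has no "name" key, whereas item['name'] would raise a KeyError. A small sketch with made-up data to show the difference:

items = [{"name": "Bulbasaur"}, {"id": 3}]               # second item has no "name"
print([item.get('name') for item in items])              # ['Bulbasaur', None]
print([item.get('name', 'missing') for item in items])   # ['Bulbasaur', 'missing']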
Try using a list comprehension:
print([d["name"] for d in contents])
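Unlike the .get('name') version above, indexing with d["name"] will raise a KeyError if any of the dictionaries is missing that key, which is often the behaviour you want when the data should always contain it.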
This is not a real answer to the question. The real answer is to use a list comprehension. However, you can make a class that allows you to use specifically the syntax you tried in the question. The general idea is to subclass list so that a slice like [:] returns a special view (another class) into the list. This special view then allows retrieval and assignment across all the dictionaries simultaneously.
class DictView:
    """
    A special class for getting and setting multiple dictionaries
    simultaneously. This class is not meant to be instantiated
    on its own, but rather in response to a slice operation on
    UniformDictList.
    """
    def __init__(self, parent, slice_):
        self.parent = parent
        self.range = range(*slice_.indices(len(parent)))

    def keys(self):
        """
        Retrieves the set of keys that are shared across all the
        indexed dictionaries. This method makes `DictView` appear as
        a genuine mapping type to `dict`.
        """
        key_set = None
        for k in self.range:
            if key_set is None:
                key_set = set(self.parent[k].keys())
            else:
                key_set &= self.parent[k].keys()
        return key_set or set()

    def __getitem__(self, key):
        """
        Retrieves a list of values corresponding to all the indexed
        values for `key` in the parent. Any missing key will raise
        a `KeyError`.
        """
        return [self.parent[k][key] for k in self.range]

    def get(self, key, default=None):
        """
        Retrieves a list of values corresponding to all the indexed
        values for `key` in the parent. Any missing key will return
        `default`.
        """
        return [self.parent[k].get(key, default) for k in self.range]

    def __setitem__(self, key, value):
        """
        Set the value for `key` to `value` in all the indexed dictionaries.
        """
        for k in self.range:
            self.parent[k][key] = value

    def update(self, *args, **kwargs):
        """
        Update all the indexed dictionaries in the parent with the
        specified values. Arguments are the same as for `dict.update`.
        """
        for k in self.range:
            self.parent[k].update(*args, **kwargs)
class UniformDictList(list):
    def __getitem__(self, key):
        if isinstance(key, slice):
            return DictView(self, key)
        return super().__getitem__(key)
Your original code would now work out of the box with just one additional wrap in UniformDictList:
import json

try:
    with open("./simple.json", 'r') as f:
        contents = UniformDictList(json.load(f))
except Exception as e:
    print(e)

print(contents[:]["name"])
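As a rough usage sketch (the inline sample data and the "generation" key are assumptions, not part of the original post), the view also supports the simultaneous assignment mentioned earlier, reusing the json import and classes defined above:

pokemon = UniformDictList(json.loads('''
[
    {"id": 1, "name": "Bulbasaur"},
    {"id": 2, "name": "Ivysaur"}
]'''))

print(pokemon[:]["name"])        # ['Bulbasaur', 'Ivysaur']
pokemon[:]["generation"] = 1     # sets the key in every indexed dict at once
print(pokemon[1])                # {'id': 2, 'name': 'Ivysaur', 'generation': 1}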