I have this JSON response after performing a Logstash aggregation.
"aggregations": {
  "range": {
    "buckets": [
      {
        "key": "2018-01-01T00:00:00.000Z-2018-01-31T00:00:00.000Z",
        "from_as_string": "2018-01-01T00:00:00.000Z",
        "to_as_string": "2018-01-31T00:00:00.000Z",
        "doc_count": 13000,
        "by ip": {
          "doc_count_error_upper_bound": 10,
          "sum_other_doc_count": 10300,
          "buckets": [
            {
              "key": "192.168.0.1",
              "doc_count": 20   <---
              .
              .
              . (sub-buckets for other fields containing doc_count as well)
            },
            {
              "key": "1.2.3.4",
              "doc_count": 50   <---
            }
          ]
        }
      }
    ]
  }
}
I would like to rename the "doc_count" key to "Number of unique events". However, I only want to do this for the doc_count of the IP-address buckets, not for any other bucket fields.
This was one of the solutions I found, but it renames doc_count in every bucket.
def rename_doc_count(obj):
    # Copy the keys first: deleting from a dict while iterating
    # over obj.keys() raises a RuntimeError in Python 3.
    for key in list(obj.keys()):
        new_key = key.replace("doc_count", "Number of unique events")
        if new_key != key:
            obj[new_key] = obj[key]
            del obj[key]
    return obj

new_json = json.loads(jres, object_hook=rename_doc_count)
Does anyone have a solution that fits my situation? Thanks!
Do it after you load the JSON, so you can call the function on just the element you want.
obj = json.loads(jres)
for bucket in obj["aggregations"]["range"]["buckets"]:
    for bucket2 in bucket["by ip"]["buckets"]:
        rename_doc_count(bucket2)
And there's no need for a loop in rename_doc_count, just fetch the specific dictionary element.
def rename_doc_count(obj):
    if "doc_count" in obj:
        obj["Number of unique events"] = obj["doc_count"]
        del obj["doc_count"]
You mention that there can be sub-buckets that should also be renamed. You can do that by having rename_doc_count call itself recursively. But you didn't show where those sit in the structure, so I don't know precisely what it should look like; it's similar to the loop after json.loads().
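Since you didn't show the exact sub-bucket layout, here is a hedged sketch of what that recursive version could look like. It assumes sub-buckets can appear as nested dicts or lists anywhere under each IP bucket, and uses an abbreviated, made-up sample (the "sub" key is hypothetical) in place of your full response:

```python
import json

# Abbreviated sample shaped like the aggregation above; the
# nested "sub" object stands in for your unspecified sub-buckets.
jres = '''
{
  "aggregations": {
    "range": {
      "buckets": [
        {
          "key": "2018-01",
          "doc_count": 13000,
          "by ip": {
            "buckets": [
              {"key": "192.168.0.1", "doc_count": 20,
               "sub": {"doc_count": 5}},
              {"key": "1.2.3.4", "doc_count": 50}
            ]
          }
        }
      ]
    }
  }
}
'''

def rename_doc_count_recursive(obj):
    """Rename "doc_count" in this dict and in any nested dict or list."""
    if isinstance(obj, dict):
        if "doc_count" in obj:
            obj["Number of unique events"] = obj.pop("doc_count")
        for value in obj.values():
            rename_doc_count_recursive(value)
    elif isinstance(obj, list):
        for item in obj:
            rename_doc_count_recursive(item)

# Apply it only to the IP buckets, as in the loop above. The
# top-level "doc_count" (13000) stays untouched; only the IP
# buckets and anything nested inside them are renamed.
obj = json.loads(jres)
for bucket in obj["aggregations"]["range"]["buckets"]:
    for ip_bucket in bucket["by ip"]["buckets"]:
        rename_doc_count_recursive(ip_bucket)
```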