As part of my internship, I am recording the latency of some write requests to a database. Thousands of values like the ones below get saved locally in a file named latency.json
(I know this is not valid JSON format). I need to find the average of these latency values, and I am supposed to write a Python program for it, but I am very new to Python.
I watched a few videos too, but they were not helpful because I don't know how to isolate only the latency value.
{"Latency": 0.05749578899849439, "Date & Time": "2021-03-10T20:50:07.132809"}
{"Latency": 0.03988014299829956, "Date & Time": "2021-03-10T20:50:07.673860"}
{"Latency": 0.055852558005426545, "Date & Time": "2021-03-10T20:50:08.230857"}
{"Latency": 0.04969122799957404, "Date & Time": "2021-03-10T20:50:08.781738"}
{"Latency": 0.04796638499828987, "Date & Time": "2021-03-10T20:50:09.330938"}
{"Latency": 0.043185365000681486, "Date & Time": "2021-03-10T20:50:10.022725"}
{"Latency": 0.0398543819974293, "Date & Time": "2021-03-10T20:50:10.563757"}
If the format is JSON Lines, then you might want to try this:

import json

with open("lines.jsonl") as lines:
    d = [json.loads(line) for line in lines]

print(f"Average for {len(d)} requests: {sum(i['Latency'] for i in d) / len(d)}")
Output:
Average for 7 requests: 0.04770369285688503
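If the file is large, you can avoid building the full list in memory by streaming the values through statistics.fmean, which accepts any iterable. A minimal sketch (the average_latency helper name is my own):

```python
import json
import statistics

def average_latency(path):
    """Average the "Latency" field of a JSON Lines file without
    loading every record into memory at once."""
    def latencies():
        with open(path) as f:
            for line in f:
                if line.strip():  # ignore blank lines
                    yield json.loads(line)["Latency"]
    return statistics.fmean(latencies())
```

Calling average_latency("lines.jsonl") on the sample data gives the same average as above; note that statistics.fmean raises StatisticsError if the file contains no records.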
A simple approach should work:

import json

s = 0.0
n = 0
with open('data_file.dat') as f:
    for line in f:
        line = line.strip()
        if line == "":
            continue  # skip blank lines
        d = json.loads(line)
        n += 1
        s += d["Latency"]

print(s / n)
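Since the question notes the file is not strictly valid JSON, you may also want to skip the odd malformed line instead of letting json.loads raise and abort the whole run. A sketch of that variant (the safe_average name and the skip-on-error policy are my own choices, not anything from the question):

```python
import json

def safe_average(path):
    """Average the "Latency" fields, skipping blank or malformed lines."""
    total = 0.0
    count = 0
    with open(path) as f:
        for line in f:
            try:
                total += json.loads(line)["Latency"]
                count += 1
            except (json.JSONDecodeError, KeyError):
                continue  # blank line, malformed JSON, or no "Latency" key
    return total / count if count else None
```

Returning None for an empty file avoids a ZeroDivisionError; the caller decides what "no valid records" should mean.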