I think latency refers to execution "speed" when bounded by some time constraint (this function cannot take more than X milliseconds to finish executing), but I don't really understand the difference between the two. Doesn't a faster function have a lower latency? Doesn't lowering the latency increase its speed? Don't those concepts imply each other?
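To make my confusion concrete, here is a rough sketch of the two measurements I have in mind (Python, purely as an illustration; `work` is a made-up placeholder function):

```python
import time

def work():
    # placeholder for whatever function's "speed" I care about
    sum(range(10_000))

# What I'd call latency: how long one call takes, start to finish.
start = time.perf_counter()
work()
latency = time.perf_counter() - start
print(f"one call took {latency * 1e3:.3f} ms")

# What I'd call "speed": how many calls finish within one second.
calls = 0
start = time.perf_counter()
while time.perf_counter() - start < 1.0:
    work()
    calls += 1
print(f"{calls} calls completed per second")
```

Aren't these two numbers just two views of the same thing?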
I have tried reading definitions of both concepts but haven't really gotten it yet, so, to better understand the difference between them, could you provide a real-world problem where (and why):
Also, I have the feeling that both concepts are used with slightly different meanings in the world of networking and in the traditional "execution speed" sense (in high-frequency trading, for example). Is that right?