I'm working on a simple program. To start, I ping all the hosts in the MAN to verify that they are online (done), but I also want to implement some way to measure the latency between hosts. Is there a way to do that? Any tips?
Anyway, thank you.
You can keep a timestamp for the ping packet and one for the pong packet, then simply compute the difference between the two. That difference is, by definition, the latency (round-trip time). You can repeat the process more than once to compute other metrics, such as jitter.
Something as simple as the following should serve the purpose:
while (sentPacket < MAX_PACK_NUM) {
    ...
    // Timestamp in ms taken just before sending, so elided work above
    // does not inflate the measurement
    long msSent = System.currentTimeMillis();
    // Send the ping
    socket.send(ping);
    // Block until the pong comes back
    socket.receive(response);
    // Timestamp in ms when the pong arrived
    long msReceived = System.currentTimeMillis();
    // The round-trip latency is the difference between the two
    long latency = msReceived - msSent;
    System.out.println("Packet " + sentPacket + ": " + latency + " ms");
    ++sentPacket;
}
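To get the jitter mentioned above, you can collect the latency of each round trip into a list and average the absolute differences between consecutive samples. A minimal sketch (the class name `JitterDemo` and the sample values are mine, for illustration only):

```java
import java.util.List;

public class JitterDemo {

    // Jitter as the mean absolute difference between consecutive
    // latency samples (a simple variant; RFC 3550 uses a smoothed estimate)
    static double jitter(List<Long> latenciesMs) {
        if (latenciesMs.size() < 2) {
            return 0.0;
        }
        long sum = 0;
        for (int i = 1; i < latenciesMs.size(); i++) {
            sum += Math.abs(latenciesMs.get(i) - latenciesMs.get(i - 1));
        }
        return (double) sum / (latenciesMs.size() - 1);
    }

    public static void main(String[] args) {
        // Hypothetical latency samples in ms collected by the ping loop above
        List<Long> samples = List.of(20L, 25L, 22L, 30L);
        // (|25-20| + |22-25| + |30-22|) / 3 = 16 / 3
        System.out.println("Jitter: " + jitter(samples) + " ms");
    }
}
```

This keeps the measurement loop unchanged: each computed `latency` value just needs to be appended to the list before the next iteration.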