
What is the proper way to calculate latency in OMNeT++?

I have written a simulation module. For measuring latency, I am using this:

simTime().dbl() - tempLinkLayerFrame->getCreationTime().dbl();

Is this the proper way? If not, please suggest an alternative; a sample code snippet would be very helpful.

Also, is the simTime() latency the actual latency in microseconds that I can report in my research paper, or do I need to scale it?

Also, I found that the channel data rate and channel delay have no impact on the link latency; instead, the latency only changes when I vary the trigger interval. For example:

timer = new cMessage("SelfTimer");
scheduleAt(simTime() + 0.000000000249, timer);

If this is not the proper way to trigger a simple module recursively, please suggest one.

Assuming both simTime() and getCreationTime() use the OMNeT++ class for representing simulation time (simtime_t), you can operate on them directly, because that class overloads the relevant operators, so there is no need to convert to double with .dbl(). Following what the manual recommends, I'd use a signal for the measurements, e.g.:

emit(latencySignal, simTime() - tempLinkLayerFrame->getCreationTime());
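A minimal sketch of the signal-based approach (OMNeT++ 5.x style), assuming a module class named LinkLayer and a signal named latency, both placeholders for your own model; the @signal/@statistic lines would go into the corresponding NED file:

// NED (excerpt):
//   @signal[latency](type=simtime_t);
//   @statistic[latency](source=latency; record=mean,max,vector);

#include <omnetpp.h>
using namespace omnetpp;

class LinkLayer : public cSimpleModule
{
  protected:
    simsignal_t latencySignal;
    virtual void initialize() override;
    virtual void handleMessage(cMessage *msg) override;
};

Define_Module(LinkLayer);

void LinkLayer::initialize()
{
    // resolve the signal name declared in the NED file
    latencySignal = registerSignal("latency");
}

void LinkLayer::handleMessage(cMessage *msg)
{
    // both operands are simtime_t, so the subtraction keeps full precision
    emit(latencySignal, simTime() - msg->getCreationTime());
    delete msg;
}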

simTime() is in seconds, not microseconds.
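If you need microseconds for reporting, convert the simtime_t value instead of scaling doubles by hand. A small sketch (variable names are placeholders; inUnit()/SIMTIME_US are available in recent OMNeT++ versions):

simtime_t latency = simTime() - tempLinkLayerFrame->getCreationTime();
int64_t latencyUs = latency.inUnit(SIMTIME_US);   // truncated to whole microseconds
double latencyUsDbl = latency.dbl() * 1e6;        // or keep the fractional part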

Regarding your last question, this code will have problems if you use it for all nodes, and you start all those nodes at the same time in the simulation. In that case you'll have perfect synchronization of all nodes, meaning you'll only see collisions in the first transmission. Therefore, it's probably a good idea to add a random jitter to every newly scheduled message at the start of your simulation.
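A sketch of that idea, where triggerInterval stands for your 0.000000000249 s period (a placeholder name) and uniform() is the module's built-in RNG helper:

// in initialize(): stagger the first self-message so nodes do not start in lockstep
scheduleAt(simTime() + uniform(0, triggerInterval), timer);

// in handleMessage(): later firings keep the nominal period
scheduleAt(simTime() + triggerInterval, timer);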
