
Simulate packet loss in UDP in Python

I am supposed to simulate a packet loss rate of 10^-2 in a Stop-and-wait protocol, that is 0.01, which means that 1 out of 100 transmitted packets gets lost. Suppose I'm sending 1000 packets; how do I drop exactly 1 random packet out of every 100 packets sent throughout the transmission?

Having a rate of 0.01 doesn't mean that exactly 1 out of 100 packets gets dropped. It means that each packet has a 1% chance of getting lost. Under the assumption that losses are independent of each other, the actual number of lost packets will follow a binomial distribution.
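As a quick illustration (a sketch, not part of your protocol code), you can repeat the "send N packets" experiment many times and look at how many are lost in each run; the counts scatter around N * p rather than being exactly 10 every time. The variable names here are placeholders.

```python
import random

N = 1000          # packets per experiment
p = 0.01          # per-packet loss probability
trials = 10_000   # number of repeated experiments

# For each trial, count how many of the N packets would be dropped.
loss_counts = [
    sum(1 for _ in range(N) if random.random() < p)
    for _ in range(trials)
]

mean_losses = sum(loss_counts) / trials
print(f"average losses per {N} packets: {mean_losses:.2f}")  # close to N * p = 10
print(f"min losses: {min(loss_counts)}, max losses: {max(loss_counts)}")
```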

For each packet you generate, check whether a random Uniform(0,1) value is less than or equal to the loss probability p, in your case 0.01. If it is, that packet is lost; otherwise it goes through. This approach scales if you increase or decrease N, the total number of packets. The expected number of losses will be N * p, but if you repeat the experiment multiple times there will be variability.
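A minimal sketch of how that per-packet check might sit in a UDP sender, assuming a plain datagram socket; the destination address, payload format, and function name are hypothetical placeholders, not part of your assignment's code.

```python
import random
import socket

LOSS_RATE = 0.01                 # per-packet loss probability
DEST = ("127.0.0.1", 9999)       # assumed receiver address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_with_loss(payload: bytes) -> bool:
    """Send payload unless the simulated channel drops it; return True if sent."""
    if random.random() < LOSS_RATE:
        return False             # packet "lost": never put on the wire
    sock.sendto(payload, DEST)
    return True

dropped = 0
for seq in range(1000):
    if not send_with_loss(f"packet {seq}".encode()):
        dropped += 1

print(f"dropped {dropped} of 1000 packets")  # on average about 10
```

Dropping the packet on the sender side like this keeps the receiver unchanged, which is usually the simplest way to exercise the Stop-and-wait retransmission logic.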
