
Math.random Scenario

Example 1: 50 + (int)(Math.random() * 50) returns an integer between 50 and 99. The maximum value in decimal form is 99.9 recurring.

Example 2: 34 + (int)(Math.random() * 21) returns an integer between 34 and 55. The maximum value in decimal form is 54.9 recurring.

Why is it that example 2 returns 55 and not 54?

Math.random() returns a double in the range [0.0, 1.0), so your assumptions are correct in each example:

For the first example, 50 + 49.9 = 99.9

For the second example, 34 + 20.9 = 54.9

I am not sure where you are getting 55 for example 2. When the double is cast to an int, the fractional part is discarded (truncation toward zero), which makes the maximum 99 for example 1 and 54 for example 2.
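A quick sketch of the truncation behavior described above (the class name and the sampling loop are just for illustration; the bounds check is probabilistic, not a proof):

```java
public class RandomRangeDemo {
    public static void main(String[] args) {
        // Casting a double to int truncates toward zero: the fractional
        // part is simply dropped, it is not rounded to the nearest integer.
        System.out.println((int) 49.999); // prints 49
        System.out.println((int) 20.999); // prints 20

        // Empirically observe the bounds of 50 + (int)(Math.random() * 50).
        int min = Integer.MAX_VALUE, max = Integer.MIN_VALUE;
        for (int i = 0; i < 1_000_000; i++) {
            int n = 50 + (int) (Math.random() * 50);
            min = Math.min(min, n);
            max = Math.max(max, n);
        }
        // With a million samples over only 50 possible values, this will
        // almost certainly print "50 99" — never 100.
        System.out.println(min + " " + max);
    }
}
```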

Edit

The book is wrong. If you wanted the range to be 34 to 55, the code would have to be:

34 + (int)(Math.random() * 22)

I assume this is from a textbook or tutorial? The second example is wrong. It should say "34 to 54", just like you thought it should.
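To illustrate why the corrected formula 34 + (int)(Math.random() * 22) covers 34 through 55: multiplying by 22 gives a double in [0.0, 22.0), the cast yields 0..21, and adding 34 shifts that to 34..55 — exactly 22 possible integers. A hedged sketch (the class name and sample count are arbitrary, and the check is probabilistic):

```java
public class CorrectedRangeDemo {
    public static void main(String[] args) {
        // Track which values 34 + (int)(Math.random() * 22) actually produces.
        boolean[] seen = new boolean[100];
        for (int i = 0; i < 1_000_000; i++) {
            seen[34 + (int) (Math.random() * 22)] = true;
        }
        // Find the smallest and largest values observed.
        int lo = -1, hi = -1;
        for (int v = 0; v < seen.length; v++) {
            if (seen[v]) {
                if (lo < 0) lo = v;
                hi = v;
            }
        }
        // Almost certainly prints "34..55" after a million samples.
        System.out.println(lo + ".." + hi);
    }
}
```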

