
0 vs 0.0 in conditional statements

Is there a difference between:

double dandouble = 5.23493; //or some other random number

if (dandouble < 0.0)
    dandouble = 3.5;

and

double dandouble = 5.23493; //or some other random number

if (dandouble < 0)
    dandouble = 3.5;

Or will they turn out the same?

The compiler must emit the Opcodes.Clt IL instruction to make the comparison. The CLI spec dictates the acceptable argument types for that instruction, and double compared with int is not among the permitted combinations. Aware of those rules, the compiler promotes one argument to reach a valid combination; double compared with double is the first match. It has enough smarts to recognize that the int argument is a literal, so it doesn't emit any IL to perform the conversion at runtime: it directly emits an Opcodes.Ldc_R8 for 0.0.
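A minimal sketch of that point (the method and class names below are my own, not from the question): both comparisons compile down to the same double-versus-double test, and disassembling the two helper methods with a tool such as ildasm shows each one loading the constant with ldc.r8 0, with no conversion instruction emitted at runtime.

using System;

class LiteralPromotionDemo
{
    // Literal is already a double.
    static bool LessThanDoubleLiteral(double d) => d < 0.0;

    // Literal 0 is promoted to 0.0 by the compiler at compile time.
    static bool LessThanIntLiteral(double d) => d < 0;

    static void Main()
    {
        double dandouble = 5.23493;

        // Both calls execute the identical comparison.
        Console.WriteLine(LessThanDoubleLiteral(dandouble)); // False
        Console.WriteLine(LessThanIntLiteral(dandouble));    // False
    }
}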

No difference.

No difference. They are the same.

double a = 0.0000000000001;
int b = 0;

bool res = a <= b;  // False
bool res2 = b >= a; // False

Given the above test, I'd say C# opts for the least lossy conversion (not a preference for the left or right side).

So to answer your question, no. There isn't a difference.
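As a quick spot-check of that claim (the sample values below are my own, chosen to cover edge cases such as signed zero, tiny magnitudes, infinities, and NaN), the two forms of the test always agree, because the literal 0 is promoted to 0.0:

using System;

class CompareZeroForms
{
    static void Main()
    {
        double[] samples =
        {
            5.23493, -5.23493, 0.0, -0.0, 1e-13, -1e-13,
            double.Epsilon, -double.Epsilon,
            double.PositiveInfinity, double.NegativeInfinity, double.NaN
        };

        foreach (double d in samples)
        {
            bool withDoubleLiteral = d < 0.0;
            bool withIntLiteral    = d < 0;
            // Both comparisons produce the same result for every sample.
            Console.WriteLine($"{d}: {withDoubleLiteral} == {withIntLiteral} -> {withDoubleLiteral == withIntLiteral}");
        }
    }
}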
