
really funky C# compiler behavior with nullable literals

The C# 4.0 compiler does not complain about this (not even a warning):

if(10.0 > null + 1)
{
}

if (myDoubleValue > null)
{
}

And it always seems to evaluate to false. What is going on here? Is null automatically converted to Nullable<double> or something?

If so why doesn't this work then:

double myDoubleValue = null + 1;

Also, why would I ever want such behavior? Why is it a good thing that it is possible to play around with literals like this?

The reason the assignment doesn't work is that the result is of type int?, not double, and there is no implicit conversion from int? to double. The result will always be the null value of the int? (aka Nullable<int>) type.
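
For example, a minimal sketch of my own (the variable name is hypothetical) showing that the result can only be stored in a nullable variable, and that it is always null:

double? maybeValue = null + 1;            // compiles: int? converts implicitly to double?
Console.WriteLine(maybeValue.HasValue);   // prints False - the lifted + produces null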

Oh, and both of the first two blocks should make the compiler complain, with warnings like this:

Test.cs(7,19): warning CS0458: The result of the expression is always 'null' of type 'int?'
Test.cs(12,13): warning CS0464: Comparing with null of type 'double?' always produces 'false'

Managing to compile without errors isn't the same as not complaining :)

if(10.0 > null + 1)
{

}

is actually equivalent to:

int? i = null + 1; // i.HasValue = false in this case
if(10.0 > i)
{

}

So you are actually trying to compare a non-nullable value to a nullable value that has no value, and such a lifted comparison always evaluates to false.
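
To illustrate with a small sketch of my own (the prints are just for demonstration), the lifted comparison is false no matter which way round you compare once one operand is null:

int? i = null;
Console.WriteLine(10.0 > i);    // False
Console.WriteLine(10.0 <= i);   // also False - every relational comparison with a null operand is false
Console.WriteLine(10.0 == i);   // False too; only != returns true when one side is null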

double myDoubleValue = null + 1;

doesn't compile because the type of the right-hand side is int?, not double.
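
If you actually wanted something along those lines to compile, a couple of variants (a sketch; the fallback value 0 is my own choice) would be:

double? a = null + 1;           // OK: int? converts implicitly to double?
double b = (null + 1) ?? 0;     // OK: ?? supplies an int fallback, which then converts implicitly to double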

Also, is this question just out of curiosity, in an attempt to #%=^ the compiler, or are you actually writing something like this in a real project?
