
Arithmetic operations with 8 and 16 bit integers

I do understand why this produces a compile-time error:

short x = 1;
short y = 2;
short z = x + y; // compile-time error

I also understand why this runs without any issues:

short x = 1;
short y = 2;
x += y; // all right, because of c# specs section 7.17.2
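Section 7.17.2 says a compound assignment like x += y implicitly inserts a cast back to the target type. A minimal sketch of the equivalent expansion (the class and variable names here are just for illustration):

```csharp
using System;

class CompoundAssignmentDemo
{
    static void Main()
    {
        short x = 1;
        short y = 2;
        // x += y; is compiled as if written with an explicit cast,
        // because short + short produces an int:
        x = (short)(x + y); // the cast truncates the int result back to short
        Console.WriteLine(x); // 3
    }
}
```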

But I've got no clue why this also works:

short x = (short)1 + (short)2;

I expected to get the same compile-time error as in the first example, but it runs successfully... Why?

Since you're using constant values, the compiler can detect that the result is allowable, evaluate the expression at compile time, and let it through. The generated IL is the same as if you had typed short x = 3;.
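The distinction is between constant and non-constant operands: with variables, short + short yields an int and you must cast explicitly. A small sketch contrasting the two cases:

```csharp
using System;

class ConstantFoldingDemo
{
    static void Main()
    {
        // Constant operands: the compiler folds the addition to 3,
        // checks that 3 fits in a short, and accepts the assignment.
        short a = (short)1 + (short)2;

        // Non-constant operands: the addition is performed at runtime
        // as int arithmetic, so an explicit cast is required.
        short x = 1;
        short y = 2;
        short z = (short)(x + y);

        Console.WriteLine(a + " " + z);
    }
}
```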

Note the following also works (for the same reason):

const short x = 1;
const short y = 2;
short z = x + y; 

But this fails, because the constant result (64001) is outside the range of short:

const short x = 32000;
const short y = 32001;
short z = x + y;

Note that this is covered in the C# Language Spec, 6.1.9 Implicit constant expression conversions:

  • A constant-expression (§7.19) of type int can be converted to type sbyte, byte, short, ushort, uint, or ulong, provided the value of the constant-expression is within the range of the destination type.
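To illustrate the rule, each assignment below uses an int constant expression that fits the destination type, so the implicit conversion applies; for an out-of-range constant, an unchecked explicit cast is one workaround (a sketch, assuming default checked constant evaluation):

```csharp
using System;

class ConstantConversionDemo
{
    static void Main()
    {
        // Each right-hand side is an int constant expression within the
        // destination type's range, so §6.1.9 permits the implicit conversion.
        sbyte sb = 100;           // fits in sbyte (-128..127)
        byte b = 200;             // fits in byte (0..255)
        short s = 32000 + 767;    // 32767, the maximum short value
        ushort us = 65000 + 535;  // 65535, the maximum ushort value

        // An out-of-range constant (e.g. short t = 32768;) would not compile.
        // An unchecked context allows an explicit truncating cast instead:
        short t = unchecked((short)(32000 + 32001)); // 64001 wraps to -1535

        Console.WriteLine(s + " " + t);
    }
}
```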

Your last snippet just compiles to the constant 3. The compiler doesn't need to call any int operators at runtime; it simply computes and stores the value at compile time.

It is the same as short x = 3;

Here is the generated IL:

IL_0001:  ldc.i4.3    // load the constant 3 onto the evaluation stack
IL_0002:  stloc.0     // store the value from the stack into x

I've got no clue why this also works:

 short x = (short)1 + (short)2; 

The compiler evaluates the right-hand-side expression at compile time and can prove that the result is within the bounds of short.
