
Does System.Decimal use more memory than 'decimal'?

I heard someone say that in C#, the capitalized Decimal uses more memory than the lowercase decimal, because Decimal has to be resolved to the lowercase decimal and that resolution costs memory.

Is that true?

No.

decimal is simply an alias for System.Decimal. They're exactly the same, and the alias is resolved at compile time.
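A minimal sketch of that identity, assuming a small console program (the AliasCheck class name is just made up for illustration):

    using System;

    class AliasCheck
    {
        static void Main()
        {
            // 'decimal' and 'System.Decimal' name the same runtime type,
            // so the comparison below prints True.
            Console.WriteLine(typeof(decimal) == typeof(System.Decimal)); // True
            Console.WriteLine(typeof(decimal).FullName);                  // System.Decimal
        }
    }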

No, that is not true.

The decimal keyword is an alias for the type System.Decimal. They are the exact same type, so there is no memory difference and no performance difference. If you use reflection to look at the compiled code, it's not even possible to tell whether the alias or the system type was used in the source code.
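As a rough illustration of the reflection point, the following sketch declares one field with the alias and one with the system type; the Prices and ReflectionDemo names are hypothetical, and both fields report the same type in the compiled metadata:

    using System;
    using System.Reflection;

    class Prices
    {
        public decimal Unit;          // declared with the alias
        public System.Decimal Total;  // declared with the system type
    }

    class ReflectionDemo
    {
        static void Main()
        {
            // Both fields report the identical System.Decimal field type;
            // the metadata gives no hint of which spelling the source used.
            foreach (FieldInfo field in typeof(Prices).GetFields())
                Console.WriteLine($"{field.Name}: {field.FieldType.FullName}");
            // Output:
            //   Unit: System.Decimal
            //   Total: System.Decimal
        }
    }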

There are two differences in where you can use the alias and the system type, though:

  • The decimal alias always refers to the system type and cannot be changed in any way. Using the Decimal identifier, on the other hand, relies on the System namespace being imported. The unambiguous name for the system type is global::System.Decimal.

  • Some language constructs only accept the alias, not the type. I can't think of an example for decimal, but when specifying the underlying type for an enum you can only use language aliases like int, not the corresponding system type like System.Int32 (see the sketch after this list).
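A short sketch of both points. The Currency enum and AliasDemo class are hypothetical names, and the claim that System.Int32 is rejected as an enum base comes from the answer above and may depend on the compiler version:

    using System;

    // Per the answer above, the underlying type of an enum must be written
    // with a language keyword such as 'int'; spelling it 'System.Int32'
    // here is reported to be rejected (at least by older compilers).
    enum Currency : int { Usd, Eur }

    class AliasDemo
    {
        decimal a = 1m;                 // language alias, always available
        Decimal b = 2m;                 // requires 'using System;'
        global::System.Decimal c = 3m;  // unambiguous fully-qualified name

        static void Main()
        {
            var demo = new AliasDemo();
            // All three fields have the exact same type.
            Console.WriteLine(demo.a.GetType() == demo.b.GetType()); // True
            Console.WriteLine(demo.c.GetType().FullName);            // System.Decimal
            Console.WriteLine(default(Currency));                    // Usd
        }
    }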

No. That's just silly.

In C#, decimal is just a synonym for Decimal. The compiler treats decimal declarations as Decimal, and the compiled code is the same as if Decimal had been written.
