Why does the Visual Studio debugger show a long hex value for an int?
While debugging I've come across this interesting behavior: the hex value string of `a` is twice as long as the others.

Can you tell why this is happening?
You are asking the debugger to evaluate the expression for you. It now acts like a compiler, converting the watch expression you entered into code and running that code to display the result. It thinks that `0xff000000` is a literal of type `long`, which is a fair call since `int` cannot store that value; it is larger than `Int32.MaxValue`. So it evaluates the `>>` operator with `long` arguments, converting the `i` value to `long` first. The result is of course `long` as well.
Since you didn't otherwise cast to a smaller type, like you did in your code, the debugger displays the result (when switched to hexadecimal output) as a `long` with 64 bits, i.e. 16 hex digits.
The other expressions don't behave that way; the literals used in them are smaller than `Int32.MaxValue`, so they are evaluated with `int` arguments, producing a 32-bit result, 8 hex digits.
Notable perhaps is that the debugger's expression evaluator is close to, but not identical to, the C# compiler's evaluator. Not an issue here, but it can matter in some cases. This may change some day when the Roslyn project finally ships.