
Why is decimal.Parse() slower than (decimal)double.Parse()?

Why is it faster to parse with double.Parse() and cast the result to decimal than to call decimal.Parse() directly?

Given the following:

string stringNumber = "18.34";

double dResult = 0d;
decimal mResult = 0m;
for (int i = 0; i < 9999999; i++)
{
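    // path 1: parse as double, then cast the result to decimal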
    mResult = (decimal)double.Parse(stringNumber);
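    // path 2: parse the string directly as decimal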
    mResult = decimal.Parse(stringNumber);
}

This produces the following metrics in the VS2017 profiler (.NET Framework v4.7):

[profiler screenshot: performance metrics]

The double.Parse() call plus the cast comes to a cumulative 37.84% of CPU usage, versus 46.93% for decimal.Parse(). That is a bigger gap than can easily be put down to the difference in datatype size. Can anyone explain it?

The app where this showed up in the profiler takes 10+ days to run, so this small difference equates to hours of runtime. It'd be good to understand why. I can see that decimal.Parse() calls out to oleaut32.dll, but... wth?
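
For anyone who wants to reproduce the comparison outside the profiler, here is a minimal Stopwatch-based timing sketch. The class name, console output, and use of Stopwatch are my own scaffolding for illustration; only the input string and iteration count come from the question.

using System;
using System.Diagnostics;

class ParseBenchmark
{
    static void Main()
    {
        const string stringNumber = "18.34";
        const int iterations = 9999999;
        decimal mResult = 0m;

        // Time the double.Parse() + cast route.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            mResult = (decimal)double.Parse(stringNumber);
        }
        sw.Stop();
        Console.WriteLine($"(decimal)double.Parse: {sw.ElapsedMilliseconds} ms");

        // Time the direct decimal.Parse() route.
        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            mResult = decimal.Parse(stringNumber);
        }
        sw.Stop();
        Console.WriteLine($"decimal.Parse:         {sw.ElapsedMilliseconds} ms");

        // Print the last result so the loops aren't optimized away.
        Console.WriteLine(mResult);
    }
}

Run as a release build; absolute times will vary by machine, but the ordering should match what the profiler reports.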

Looking at the implementation source for double and the start of the implementation for decimal, it looks like decimal.Parse() handles the precision in place, while double.Parse() is optimized to work with integers as much as possible.
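
One caveat before swapping decimal.Parse() for the faster route: the explicit double-to-decimal conversion keeps at most 15 significant digits, so the two routes only agree when the input fits comfortably in a double. The values below are my own examples (not from the question), and a culture that uses '.' as the decimal separator is assumed.

using System;

class PrecisionCheck
{
    static void Main()
    {
        // Short input like the one in the question: both routes agree.
        Console.WriteLine(decimal.Parse("18.34"));                          // 18.34
        Console.WriteLine((decimal)double.Parse("18.34"));                  // 18.34

        // Long input: double cannot hold all the digits, so the cast route drops them.
        Console.WriteLine(decimal.Parse("1.0000000000000000001"));          // 1.0000000000000000001
        Console.WriteLine((decimal)double.Parse("1.0000000000000000001"));  // 1
    }
}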
