Why is it faster to parse with double.Parse() and cast to decimal rather than calling decimal.Parse()?
Given the following:
string stringNumber = "18.34";
double dResult = 0d;
decimal mResult = 0m;
for (int i = 0; i < 9999999; i++)
{
mResult = (decimal)double.Parse(stringNumber);
mResult = decimal.Parse(stringNumber);
}
This results in the following metrics in the VS2017 profiler (.NET Framework 4.7): the cumulative double.Parse() and cast comes to 37.84% of CPU usage vs decimal.Parse()'s 46.93%. That is more of a difference than can easily be put down to the difference in data type size. Can anyone explain?
The app where this came up in the profiler takes 10+ days to run, so this small difference equates to hours of runtime. It'd be good to understand why. I can see that decimal.Parse() calls out to oleaut32.dll, but...wth?
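For reference, a minimal standalone Stopwatch sketch (not the original profiler run; the iteration count, program structure, and console output are just for illustration) could be used to reproduce the comparison outside the profiler:

// Minimal timing sketch, assuming the same literal and a fixed iteration count.
using System;
using System.Diagnostics;

class ParseBenchmark
{
    static void Main()
    {
        const string stringNumber = "18.34";
        const int iterations = 9999999;
        decimal mResult = 0m;

        // Time double.Parse followed by a cast to decimal.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            mResult = (decimal)double.Parse(stringNumber);
        }
        sw.Stop();
        Console.WriteLine($"double.Parse + cast: {sw.ElapsedMilliseconds} ms");

        // Time decimal.Parse directly.
        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            mResult = decimal.Parse(stringNumber);
        }
        sw.Stop();
        Console.WriteLine($"decimal.Parse:       {sw.ElapsedMilliseconds} ms");

        // Reference the result so the assignments are not optimised away.
        Console.WriteLine(mResult);
    }
}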