
Convert.ChangeType messing up return value

We have had a validation in our code base for years, and it worked just fine until now. The issue started when values with four decimal places came into the picture. To see it, review the following lines of code:

//up to 3 decimal places, returns just fine
//temp is equal to 1000.003
var temp = System.Convert.ChangeType("1000.003", TypeCode.Single);

//Test for values up to 4 decimal places
//Iteration:1 
//Supplied 1000.0001 Return 1000.00012
var temp1 = System.Convert.ChangeType("1000.0001", TypeCode.Single);

//Iteration:2
//Supplied 1000.0004 Return 1000.00043
var temp2 = System.Convert.ChangeType("1000.0004", TypeCode.Single);

//Iteration:3
//Supplied 1000.0007 Return 1000.00067
var temp3 = System.Convert.ChangeType("1000.0007", TypeCode.Single);

Why is it behaving this way? And another thing: if I change the TypeCode to Double, the precision is preserved. Why?

From MSDN:

A Single value has up to 7 decimal digits of precision, although a maximum of 9 digits is maintained internally.

You're running into cases where the number you pass in can't be represented exactly as a Single, so it gives you the closest Single value it can. If the decimal representation must be preserved exactly, then Decimal is a more appropriate type.
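
For illustration, here is a minimal sketch (assuming invariant-culture parsing; the "G9" format is used only to expose the extra stored digits) contrasting TypeCode.Single with TypeCode.Decimal:

using System;
using System.Globalization;

class PrecisionDemo
{
    static void Main()
    {
        // Parsed to the nearest representable 32-bit float; the stored value
        // is not exactly 1000.0001
        float asSingle = (float)Convert.ChangeType(
            "1000.0001", TypeCode.Single, CultureInfo.InvariantCulture);
        Console.WriteLine(asSingle.ToString("G9", CultureInfo.InvariantCulture)); // ~1000.00012

        // decimal stores base-10 digits, so all four decimal places survive exactly
        decimal asDecimal = (decimal)Convert.ChangeType(
            "1000.0001", TypeCode.Decimal, CultureInfo.InvariantCulture);
        Console.WriteLine(asDecimal); // 1000.0001
    }
}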

Note that the number of digits after the decimal point is irrelevant; you'd have the same issue with a number like 100,000,010, as the sketch below shows.
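
A quick sketch of the same effect with a whole number, with Double included for comparison (the exact printed neighbour may vary with runtime formatting):

using System;

class WholeNumberDemo
{
    static void Main()
    {
        // 100,000,010 needs 9 significant digits, more than a float holds,
        // so the nearest representable float is stored instead
        float f = (float)Convert.ChangeType("100000010", TypeCode.Single);
        Console.WriteLine(f.ToString("G9")); // a neighbouring value such as 100000008

        // double carries roughly 15-17 significant digits, which is why
        // switching the TypeCode to Double preserves these particular values
        double d = (double)Convert.ChangeType("100000010", TypeCode.Double);
        Console.WriteLine(d); // 100000010
    }
}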
