
Decimal out of range exception

In my C# application, I am trying to save a decimal price into a SQL Server table. The column's type is decimal, with no precision or scale specified.

Without discount calculations, everything works fine.

But when I run some calculations, the final value is 21800, and I get the following error when trying to save it:

"Parameter value '218000/00000000000000' is out of range."

I don't understand where the extra zeros come from! The only thing I know is that myValue.ToString() also returns 218000/00000000000000!

I understand that the digits after the decimal point are produced by the calculations. But whatever they do, the value shows as 21800 when I watch it in the debugger. Why is it trying to save 218000/00000000000000? And why does it care about the zeros at all?
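Here is a minimal sketch that seems to reproduce what I see (the exact calculation in my real code is different, and the "/" separator comes from my machine's regional settings; it is only simulated here):

    using System;
    using System.Globalization;

    class Repro
    {
        static void Main()
        {
            // A calculation whose result is numerically 21800, but whose
            // decimal scale is 14, so fourteen trailing zeros are kept.
            decimal price = 218m * 100.00000000000000m;

            Console.WriteLine(price.ToString(CultureInfo.InvariantCulture));
            // 21800.00000000000000

            // If the thread's culture uses "/" as its decimal separator
            // (simulated here by cloning the invariant culture), ToString()
            // produces output that looks just like the error message.
            var slashCulture = (CultureInfo)CultureInfo.InvariantCulture.Clone();
            slashCulture.NumberFormat.NumberDecimalSeparator = "/";
            Console.WriteLine(price.ToString(slashCulture));
            // 21800/00000000000000
        }
    }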

Any ideas why this happens and how to fix it?

There are two ways to store decimal numbers:

  • Fixed-point: This is what you have defined in your SQL Server table. You defined decimal(19, 4), which means 19 digits in total, of which 4 are after the decimal point. With such a type, every number always has exactly 4 digits after the decimal point, and you cannot store a number with more than 4 digits after the decimal point. And note, there is no fixed-point equivalent in C#!

  • Floating-point: These are the types float, double and decimal in C#, and the type float(n) in SQL Server. In floating point, as the name says, you can vary the number of digits before and after the decimal point. (Technically, you have a fixed number of digits, the so-called mantissa, and you then move the decimal point around with a second number, the exponent; for C#'s decimal, see the sketch after this list.)
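For C#'s decimal you can actually observe that mantissa/exponent pair: every value carries a scale with it, and trailing zeros produced by a calculation are kept in that scale. A small illustrative sketch (not part of your code):

    using System;

    class ScaleDemo
    {
        static void Main()
        {
            decimal a = 21800m;                // scale 0
            decimal b = 21800.00000000000000m; // same value, scale 14

            Console.WriteLine(a == b);         // True - numerically equal

            // The fourth element of GetBits() stores the scale in bits 16-23.
            Console.WriteLine((decimal.GetBits(a)[3] >> 16) & 0xFF);  // 0
            Console.WriteLine((decimal.GetBits(b)[3] >> 16) & 0xFF);  // 14
        }
    }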

So now to your project: in your database you use fixed point, while in C# you have floating point. Therefore, when you calculate something in C# and want to store it in your database, it always has to be converted from floating point to fixed point. If the C# number does not fit the decimal type defined in the database, you get the out-of-range error.
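A rough sketch of one way to handle that conversion explicitly, assuming a decimal(19, 4) column and placeholder table/parameter names (not your actual schema):

    using System;
    using System.Data;
    using System.Data.SqlClient; // or Microsoft.Data.SqlClient

    class SaveDemo
    {
        static void SavePrice(string connectionString, decimal price)
        {
            // Round to the scale of the target column (decimal(19, 4)),
            // so the floating-point decimal fits the fixed-point column.
            decimal fitted = decimal.Round(price, 4);

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "INSERT INTO Prices (Price) VALUES (@Price)", connection))
            {
                var p = command.Parameters.Add("@Price", SqlDbType.Decimal);
                p.Precision = 19;  // total digits of the column
                p.Scale = 4;       // digits after the decimal point
                p.Value = fitted;

                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }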

I had to specify the precision and scale of the decimal column in the SQL table (I set it to decimal(19, 4)).

Besides, the corresponding value on the C# side also needed to be brought to a matching precision.

I still don't know where the extra zeros come from or why SQL Server cares about them. Any explanation is appreciated.
