
Which C data type should be used for decimal numbers?

I need to use a field of type numeric(11,9) in MySQL. I'm using the C API, so which data type should I use? I used long, and it gives results like this:

12.123456789 to 12.123457000

I don't want to lose precision. I think I should use long double, but what format specifier ( %_ ) should I use? For double it's %f, but what is it for long double?

long double d = 12.123456789L;  /* L suffix keeps the literal at long double precision */
printf ("%.9Lf", d);            /* %Lf is the specifier for long double; %lf is for double */

Whenever the value has fewer than 9 digits after the decimal point, the output is padded with zeroes.

It really depends. If you really want to be exact, a C string ( char * ) would give you that. A struct of your own definition, storing a long plus a 'scaling factor', would probably let you do calculations better. And double gives you roughly 15 significant digits to work with, but computation with double may accumulate too much 'noise' in the low bits, wrecking data that might need to be exact.

The specifier for long double is "%Lf".

But even double or long double (which are the same type on some platforms) won't give you exactly the same value, because the binary representation of 12.123456789 is an infinite fraction. If you are restricted to pure C, you can use fixed point based on a 64-bit integer type.

You should use the scanf specifier %lf with double, or the scanf specifier %Lf with long double.
