
Why do Java and C# differ in simple addition?

I have two snippets, one in Java and one in C#.

float a = 1234e-3f;
float b = 1.23f;
float ca = 1.234e3f;
float d = 43.21f;
long e = 1234L;
int f = 0xa;
int g = 014;
char h = 'Z';
char ia = ' ';


byte j = 123;
short k = 4321;

System.out.println(a+b+ca+d+e+f+g+h+ia+j+k);

The Java snippet prints 7101.674,

and in C#

float a = 1234e-3f;
float b = 1.23f;
float ca = 1.234e3f;
float d = 43.21f;
long e = 1234L;
int f = 0xa;
int g = 014;
char h = 'Z';
char ia = ' ';


byte j = 123;
short k = 4321;

Console.WriteLine(a+b+ca+d+e+f+g+h+ia+j+k);

produces a result of 7103.674.

Why are the two results off by 2, and which one is correct?

The difference is in this line:

int g = 014;

A leading zero makes an integer literal octal in Java (014 == 12), whereas C# has no zero-prefixed octal literals and parses it as decimal (014 == 14). The other ten terms add up to 7089.674 in both languages, so Java prints 7089.674 + 12 = 7101.674 and C# prints 7089.674 + 14 = 7103.674. Both outputs are correct under their respective language specifications; the same sequence of characters simply denotes two different values.
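To check the literal difference in isolation, here is a minimal, runnable Java sketch (the class name OctalDemo is just an illustrative choice, not from the original snippets):

public class OctalDemo {
    public static void main(String[] args) {
        int octal = 014;    // leading zero: octal literal, value 12
        int decimal = 14;   // plain decimal literal, value 14
        System.out.println(octal);           // prints 12
        System.out.println(decimal - octal); // prints 2, the exact gap between the two results
    }
}

The same assignment, int g = 014;, compiles to 14 in C#, which treats the leading zero as insignificant.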
