
How well do Script# numbers map to JavaScript?

I've been playing with Script#, and I was wondering how C# numbers are converted to JavaScript. I wrote this little bit of code:

int a = 3 / 2;

and looked at the relevant bit of the compiled JavaScript:

var $0=3/2;

In C#, the result of 3 / 2 assigned to an int is 1, but in JavaScript, which has only one number type, it is 1.5.

Because of this disparity between the C# and JavaScript behaviour, and since the compiled code doesn't seem to compensate for it, should I assume that my numeric calculations written in C# might behave incorrectly when compiled to JavaScript?

Should I assume that my numeric calculations written in C# might behave incorrectly when compiled to JavaScript?

Yes.

As you said, "the compiled code doesn't seem to compensate for it" - though for the case you mention, where a was declared as an int, it would be easy enough to compensate by emitting var $0 = Math.floor(3/2);. But if you don't control how the "compiler" works, you're in a pickle. (You could correct the JavaScript manually, but you'd have to do that every time you regenerated it. Yuck.)
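As an illustration (not something Script# actually emits), here is a minimal sketch of what such a compensating helper could look like in plain JavaScript. One detail worth noting: C# integer division truncates toward zero, so Math.trunc is a closer match than Math.floor once negative operands are involved.

function csharpIntDiv(a, b) {
    // C# int division truncates toward zero; Math.trunc does the same.
    // Math.floor would give -2 for -3 / 2, where C# gives -1.
    return Math.trunc(a / b);
}

csharpIntDiv(3, 2);   // 1, matching C#'s int a = 3 / 2;
csharpIntDiv(-3, 2);  // -1, matching C#; Math.floor(-1.5) would be -2
3 / 2;                // 1.5, what the generated var $0 = 3/2; yields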

Note also that you are likely to have problems with decimal numbers too, because JavaScript stores every number as an IEEE 754 double-precision binary floating-point value. Most people are surprised the first time they find out that JavaScript will tell you that 0.4 * 3 works out to be 1.2000000000000002. For more details see one of the many other questions on this issue, e.g., How to deal with floating point number precision in JavaScript?. (C#'s double type behaves the same way, though its decimal type does not, so maybe this issue won't be such a surprise. Still, it can be a trap for new players...)
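To make that concrete, here is a small example of the surprise along with two standard workarounds. Neither is specific to Script#; they are general JavaScript idioms.

0.4 * 3;                    // 1.2000000000000002
0.4 * 3 === 1.2;            // false

// Workaround 1: compare with a tolerance rather than strict equality.
function nearlyEqual(a, b) {
    return Math.abs(a - b) < Number.EPSILON * Math.max(Math.abs(a), Math.abs(b), 1);
}
nearlyEqual(0.4 * 3, 1.2);  // true

// Workaround 2: scale to integers, compute, then divide once at the end.
(4 * 3) / 10;               // 1.2, since only one rounding step occurs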
