
JavaScript: Is This Truly Signed Integer Division?

Given the following code, where both a and b are Numbers representing values within the range of signed 32-bit integers:

var quotient = ((a|0) / (b|0))|0;

and assuming that the runtime is fully compliant with the ECMAScript 6 specification, will the value of quotient always be the correct signed integer division of a and b treated as integers? In other words, is this a proper way to achieve true signed integer division in JavaScript, equivalent to the machine instruction?
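As a sanity check (my own sketch, not from the original question), one can compare the expression against Math.trunc of the exact floating-point quotient — which is what C-style truncating division would produce — over random 32-bit operands:

function checkRandomPairs(n) {
    for (var i = 0; i < n; i++) {
        // Draw random signed 32-bit operands; |0 wraps [0, 2^32) into [-2^31, 2^31).
        var a = (Math.random() * 0x100000000) | 0;
        var b = (Math.random() * 0x100000000) | 0;
        if (b === 0) continue; // skip division by zero
        var quotient = ((a | 0) / (b | 0)) | 0;
        if (quotient !== Math.trunc(a / b)) {
            return { a: a, b: b, quotient: quotient }; // report the first mismatch
        }
    }
    return null; // no mismatch found
}

console.log(checkRandomPairs(1000000)); // expected: null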

I'm no expert on floating-point numbers, but Wikipedia says that doubles carry 52 explicit mantissa bits (53 bits of effective precision). Logically, that is more than enough to represent any 32-bit integer exactly, so the integer division of two 32-bit operands should be reliably recoverable from the floating-point quotient.
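For context, every 32-bit signed integer lies well inside the integer range a double can represent exactly (up to 2^53 - 1), which a quick console check confirms:

// Both 32-bit extremes are exactly representable in a double.
console.log(Number.isSafeInteger(-2147483648)); // true
console.log(Number.isSafeInteger(2147483647));  // true
console.log(Number.MAX_SAFE_INTEGER === Math.pow(2, 53) - 1); // true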

Dividing the minimum 32-bit signed int by the maximum, -2147483648 / 2147483647, produces -1.0000000004656613, which still carries plenty of significant digits. The same goes for its reciprocal, 2147483647 / -2147483648, which produces -0.9999999995343387.
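These boundary cases are easy to reproduce, along with what |0 then makes of them:

console.log(-2147483648 / 2147483647);       // -1.0000000004656613
console.log((-2147483648 / 2147483647) | 0); // -1 (truncated toward zero)
console.log(2147483647 / -2147483648);       // -0.9999999995343387
console.log((2147483647 / -2147483648) | 0); // 0  (truncated toward zero)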

One exception is division by zero, which I mentioned in a comment. As the linked SO question states, integer division by zero normally throws some sort of error, whereas the floating-point coercion yields (1 / 0) | 0 == 0.
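A few console checks show how the coercion absorbs every division-by-zero case:

console.log(1 / 0);        // Infinity
console.log((1 / 0) | 0);  // 0 (ToInt32 maps Infinity to +0)
console.log((-1 / 0) | 0); // 0 (ToInt32 maps -Infinity to +0)
console.log(0 / 0);        // NaN
console.log((0 / 0) | 0);  // 0 (ToInt32 maps NaN to +0)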

Update: According to another SO answer, integer division in C truncates toward zero, which is exactly what |0 does in JavaScript. In addition, division by 0 is undefined behaviour in C, so JavaScript is technically not incorrect in returning zero. Unless I've missed something else, the answer to the original question should be yes.
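The truncation behaviour is easiest to see on a negative quotient, where |0 agrees with Math.trunc (and with C) rather than with Math.floor:

console.log((-7 / 2) | 0);       // -3 (truncates toward zero, like C)
console.log(Math.trunc(-7 / 2)); // -3 (same result)
console.log(Math.floor(-7 / 2)); // -4 (rounds toward -Infinity instead)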

Update 2: Relevant sections of the ECMAScript 6 spec: how numbers are divided and how a value is converted to a 32-bit signed integer (the ToInt32 operation, which is what |0 performs).
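For reference, here is a minimal sketch (my own paraphrase, not the spec's exact wording) of the ToInt32 conversion that |0 triggers: truncate toward zero, reduce modulo 2^32, then map the upper half of that range onto the negative signed values:

function toInt32(x) {
    x = Number(x);
    // NaN, +/-0, and +/-Infinity all map to +0.
    if (!isFinite(x) || x === 0) return 0;
    // Truncate toward zero, then reduce modulo 2^32.
    x = Math.trunc(x) % 0x100000000;
    if (x < 0) x += 0x100000000; // normalize into [0, 2^32)
    // Values in [2^31, 2^32) represent negative signed integers.
    return x >= 0x80000000 ? x - 0x100000000 : x;
}

console.log(toInt32(2147483648) === (2147483648 | 0)); // true (-2147483648)
console.log(toInt32(-1.9) === (-1.9 | 0));             // true (-1)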
