
`[1] == [1]` returns “false” and `[1] == 1` returns “true”?

I found this strange behavior in JavaScript.

var v1 = [1];
var v2 = [1];

v1 == v2    // false

v1 == 1     // true

[1] == [1]  // false

1 == [1]    // true

Why is it that [1] == [1] returns false and [1] == 1 returns true?

The spec says that if the two operands of == have the same type (as in the [1] == [1] case, where both are of type Object), then == behaves exactly like ===. The two arrays are not the exact same object, so false is returned. Notice that:

var v1 = [1];
var v2 = v1;

v1 == v2; // true
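
To see that == really falls back to a reference (identity) check when both operands are objects, the loose and strict results can be compared side by side. A minimal sketch (the variable names a, b and c are just for illustration):

var a = [1];
var b = [1];

a == b     // false -- same type (Object), so == behaves like ===
a === b    // false -- two distinct array objects

var c = a;
a == c     // true  -- same object reference
a === c    // true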

When the operands have different types, they are both coerced. In the case of 1 == [1], the step of the spec's Abstract Equality Comparison algorithm that handles an object compared with a number applies first: the array is converted to a primitive by its toString(), which returns '1'. Then the rule that compares a string with a number applies, converting the string '1' to the number 1, and the comparison becomes 1 == 1. Finally, the operands have the same type and are compared with ===, and 1 === 1 obviously evaluates to true.
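
The same chain of conversions can be reproduced step by step in the console. A minimal sketch of the coercion (not the engine's internal algorithm):

[1].toString()   // '1'  -- the array becomes a primitive via its toString()
Number('1')      // 1    -- the string is then converted to a number
1 === 1          // true -- same type, so the final comparison is strict

1 == [1]         // true -- the whole chain in one step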
