I know that in JavaScript a 1 evaluates as true, while a 0 evaluates as false.
My question is regarding the while loop in JavaScript, which has the following format:
while (condition) {
code block to be executed
}
If I pass a single integer as the condition of a while loop, how does the code behave? More specifically, here is the code I was working on before posting this question:
function chunkArrayInGroups(arr, size) {
  var newArr = [];
  // Loop while arr still has elements (a nonzero length is truthy)
  while (arr.length) {
    // Remove the first `size` elements from arr and push them as one chunk
    newArr.push(arr.splice(0, size));
  }
  return newArr;
}
chunkArrayInGroups([0, 1, 2, 3, 4, 5], 2);
and it seems to work perfectly fine, returning newArr the way I need it to: it divides the original array ( arr ) into groups of the required size ( size ) and pushes each group onto newArr.
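To see the loop run to completion, here is the same function with its result printed (a quick sketch; the output follows from repeatedly splicing two elements off the front until the array is empty):

```javascript
function chunkArrayInGroups(arr, size) {
  var newArr = [];
  // arr.length is truthy until arr has been emptied by splice
  while (arr.length) {
    newArr.push(arr.splice(0, size));
  }
  return newArr;
}

var result = chunkArrayInGroups([0, 1, 2, 3, 4, 5], 2);
console.log(result); // [[0, 1], [2, 3], [4, 5]]
```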
How does the condition arr.length evaluate? My assumption is that it evaluates as true as long as it is not zero, i.e. a nonzero number, despite it not being a comparison such as i < 2.
This exercise came from freeCodeCamp: Chunky Monkey
Any nonzero number evaluates to true; 0 evaluates to false. So your code will run until arr.length is equal to 0. It is the same as saying while (arr.length != 0)
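You can check this coercion yourself with Boolean(), which applies the same truthiness rules a while condition does (a minimal sketch):

```javascript
// Boolean() performs the same coercion a while condition applies
console.log(Boolean(0));  // false
console.log(Boolean(1));  // true
console.log(Boolean(6));  // true: any nonzero number is truthy
console.log(Boolean(-1)); // true: negative numbers are nonzero too

// So this loop drains the array and stops when length reaches 0:
var arr = [1, 2, 3];
while (arr.length) {
  arr.pop();
}
console.log(arr.length); // 0
```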
If you're evaluating the array's length property, it will be true as long as the array is not empty (length is not 0). When the array is empty (length is 0), the evaluation returns false.
Yes, you are absolutely right: it evaluates as true while arr has a length, because arr.length is > 0 and so is considered true. It is a neat feature of JavaScript; however, it is not that readable (as you have found) and can therefore be confusing to anyone not used to it. If you are coding for yourself, though, this is a clever little shortcut!
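If readability is a concern, the condition can be written out explicitly; both forms behave identically for a numeric length (a small sketch, with chunkExplicit as an illustrative name):

```javascript
function chunkExplicit(arr, size) {
  var newArr = [];
  // Same loop as while (arr.length), but the comparison makes intent obvious
  while (arr.length > 0) {
    newArr.push(arr.splice(0, size));
  }
  return newArr;
}

console.log(chunkExplicit([0, 1, 2, 3, 4, 5], 2)); // [[0, 1], [2, 3], [4, 5]]
```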