Let's say I've got the following array of objects in JavaScript:
const requests = [
  { id: 1, person: { id: 1 } },
  { id: 2, person: { id: 1 } },
  { id: 3, person: { id: 2 } },
  { id: 4, person: { id: 3 } },
  { id: 5, person: { id: 2 } }
]
What I've written below goes over each item in the array and builds a new array containing just each person's id.
const requestsPeopleIds = []
for (const request of requests) {
  requestsPeopleIds.push(request.person.id)
}
I then take that new array and create another new array using Set to remove the duplicate ids:
const uniquePeopleIds = Array.from(new Set(requestsPeopleIds))
The final result is as I'd expect:
console.log(uniquePeopleIds) // [1, 2, 3]
where these are the unique ids of the people who made a request. So out of the 5 requests, these were made by 3 people.
There must be a more efficient way of doing this, so I'm reaching out to you Stack Overflow JS gurus.
Thanks in advance.
I think you've got the basics. Here's a way to tighten the code:

const ids = new Set()
requests.forEach(i => ids.add(i.person.id))
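For completeness, a self-contained sketch of this approach on the asker's data, which also spreads the Set back into a plain array at the end (assuming an array is ultimately wanted, as in the question):

```javascript
const requests = [
  { id: 1, person: { id: 1 } },
  { id: 2, person: { id: 1 } },
  { id: 3, person: { id: 2 } },
  { id: 4, person: { id: 3 } },
  { id: 5, person: { id: 2 } }
]

// Collect unique person ids; the Set silently ignores duplicates.
const ids = new Set()
requests.forEach(r => ids.add(r.person.id))

// Spread the Set back into a plain array when one is needed.
const uniquePeopleIds = [...ids]
console.log(uniquePeopleIds) // [1, 2, 3]
```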
You could also do this with the map method and spread syntax (...).
const requests = [{"id":1,"person":{"id":1}},{"id":2,"person":{"id":1}},{"id":3,"person":{"id":2}},{"id":4,"person":{"id":3}},{"id":5,"person":{"id":2}}]

const result = [...new Set(requests.map(({ person: { id } }) => id))]

console.log(result) // [1, 2, 3]
You can do it by making an object with each person's id as a key, then getting the keys of that object.
const requests = [{"id":1,"person":{"id":1}},{"id":2,"person":{"id":1}},{"id":3,"person":{"id":2}},{"id":4,"person":{"id":3}},{"id":5,"person":{"id":2}}]

// Take an empty object
const uniques = {};

// Iterate through the requests array and use each person's id as a
// key of the object, storing any value at that key (here I put 1).
requests.forEach(request => (uniques[request.person.id] = 1));

// Finally, get the keys of the uniques object.
console.log(Object.keys(uniques)); // ["1", "2", "3"]
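One caveat with the object-keys approach: object keys are always strings, so `Object.keys` yields `["1", "2", "3"]` rather than numbers. A small sketch that maps them back, assuming the ids should stay numeric:

```javascript
const requests = [
  { id: 1, person: { id: 1 } },
  { id: 2, person: { id: 1 } },
  { id: 3, person: { id: 2 } },
  { id: 4, person: { id: 3 } },
  { id: 5, person: { id: 2 } }
]

// Use each person's id as an object key; duplicates just overwrite.
const uniques = {}
requests.forEach(request => (uniques[request.person.id] = 1))

// Object keys are strings, so convert them back to numbers.
const uniquePeopleIds = Object.keys(uniques).map(Number)
console.log(uniquePeopleIds) // [1, 2, 3]
```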
I've done some research and have inferred some interesting facts:
It looks like with highly varied data and a larger array, the Set collection does not show the best results. Set is a well-optimized collection; however, every add must first check whether the element is already in the Set. (Per the ECMAScript spec that check is sublinear on average, not O(n), but it still carries more overhead than a plain property lookup.) We can instead use a simple JavaScript object: checking whether an object contains a key is O(1), so in these benchmarks the object has an advantage over Set.
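Both structures produce the same unique ids; a minimal side-by-side sketch (on the asker's data) showing the Set's implicit membership check next to the object's explicit one:

```javascript
const requests = [
  { id: 1, person: { id: 1 } },
  { id: 2, person: { id: 1 } },
  { id: 3, person: { id: 2 } },
  { id: 4, person: { id: 3 } },
  { id: 5, person: { id: 2 } }
]

// Set-based: the Set handles the "already seen?" check internally.
const viaSet = [...new Set(requests.map(r => r.person.id))]

// Object-based: an explicit hasOwnProperty check before recording.
const seen = {}
const viaObject = []
for (const r of requests) {
  if (!seen.hasOwnProperty(r.person.id)) {
    seen[r.person.id] = 1
    viaObject.push(r.person.id)
  }
}

console.log(viaSet)    // [1, 2, 3]
console.log(viaObject) // [1, 2, 3]
```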
The forEach arrow-function form is very convenient; however, a simple for loop is faster.
Adding console.log makes Set the fastest solution; without console.log, the fastest solution is the combination of a for loop and an object.
So the most performant code without console.log() looks like this:
const hashMap = {};
const uniques = [];
for (let index = 0; index < requests.length; index++) {
  if (!hashMap.hasOwnProperty(requests[index].person.id)) {
    hashMap[requests[index].person.id] = 1;
    uniques.push(requests[index].person.id);
  }
}
However, the most performant code with console.log() looks like this (I cannot work out why that is; it would be really great to know the reason):

const ids = new Set();
requests.forEach(i => ids.add(i.person.id));
console.log(ids)
Tests:
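The original benchmark links are not preserved here; as a rough sketch, timings like the ones described could be reproduced in Node with `console.time` on a larger synthetic data set (numbers will vary with the engine, data shape, and warm-up, so treat this as illustrative only):

```javascript
// Build a larger synthetic data set so the timings are measurable:
// 100,000 requests made by 1,000 distinct people.
const requests = Array.from({ length: 100000 }, (_, i) => ({
  id: i + 1,
  person: { id: (i % 1000) + 1 }
}))

console.time("set + forEach")
const ids = new Set()
requests.forEach(r => ids.add(r.person.id))
console.timeEnd("set + forEach")

console.time("for loop + object")
const hashMap = {}
const uniques = []
for (let i = 0; i < requests.length; i++) {
  const personId = requests[i].person.id
  if (!hashMap.hasOwnProperty(personId)) {
    hashMap[personId] = 1
    uniques.push(personId)
  }
}
console.timeEnd("for loop + object")
```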