I have, for example, a two-dimensional array, items. I fill it from a database, and it grows to 2600+ objects, but if I could somehow make the entries unique it would be around 30 objects. How can I solve this?
The setup is as follows. This is how I get the information:
$.getJSON(url1, function (data1) {
    for (var i in data1.layers) {
        $.each(data1.layers[i].legend, function (a, b) {
            for (var a in data1.layers[i].legend[a]) {
                $.each(data1.layers[i].legend, function (key, val) {
                    items.push({ label: val.label, url: "long link" + val.url });
                });
            }
        });
    }
});
items[0].label
items[0].url
items[1].label
items[1].url
etc...
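For what it's worth, the blow-up from ~30 to 2600+ entries is consistent with the nested loops above: the inner $.each walks the whole legend once for every pass of the outer loops, so every entry gets pushed many times. A minimal plain-JS sketch of that effect, with invented sample data and no network call:

```javascript
// Mock payload standing in for the JSON response; field names are assumed.
var data1 = {
    layers: [
        { legend: [{ label: "Roads", url: "/roads" }, { label: "Rivers", url: "/rivers" }] }
    ]
};

var items = [];
data1.layers.forEach(function (layer) {
    layer.legend.forEach(function () {        // outer pass, mirrors the outer $.each
        layer.legend.forEach(function (val) { // inner pass pushes everything again
            items.push({ label: val.label, url: "long link" + val.url });
        });
    });
});
// items.length is 4 here (2 legend entries x 2 outer passes), not 2
```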
I found another Stack Overflow page about this in PHP, but I can't get it to work in JavaScript/jQuery: Stackoverflow php solution
You're making a mistake by pushing them onto an array in the first place. You should be building an associative array, with the unique aspect as the key.
Use a dictionary to track which items you have already seen:
var dict = {};
$.getJSON(url1, addToLocalArray);

function addToLocalArray(data1) {
    for (var i in data1.layers) {
        $.each(data1.layers[i].legend, function (a, b) {
            for (var a in data1.layers[i].legend[a]) {
                $.each(data1.layers[i].legend, function (key, val) {
                    if (!dict[val.url + '-' + val.label]) {
                        // or create some sort of unique hash for each element...
                        dict[val.url + '-' + val.label] = 1;
                        items.push({ label: val.label, url: "long link" + val.url });
                    }
                });
            }
        });
    }
}
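The dictionary check above can be tried in isolation, without jQuery or a network call. Here it is extracted into a plain function (the function name and sample data are my own, the key scheme is the same):

```javascript
var dict = {};
var items = [];

// Push val onto items only if its url+label pair hasn't been seen before.
function pushUnique(val) {
    var key = val.url + '-' + val.label;
    if (!dict[key]) {
        dict[key] = 1;
        items.push({ label: val.label, url: "long link" + val.url });
    }
}

pushUnique({ label: "Roads", url: "/roads" });
pushUnique({ label: "Roads", url: "/roads" });   // duplicate, skipped
pushUnique({ label: "Rivers", url: "/rivers" });
// items now holds 2 entries, not 3
```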
I would suggest either:
A) Push the uniqueness filtering off to the database (e.g. a SELECT DISTINCT query). It's much more efficient at this and won't require exhaustive iteration.
B) Have a key for each unique item (as vogomatix suggested).
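One way to sketch option B: key a plain object by the unique field, so duplicates simply overwrite each other, then read the values back out when you need an array. Sample data and the choice of url as the key are assumptions for illustration:

```javascript
var byUrl = {};

// Later duplicates overwrite the same key, so only one copy survives.
[{ label: "Roads", url: "/roads" },
 { label: "Rivers", url: "/rivers" },
 { label: "Roads", url: "/roads" }].forEach(function (val) {
    byUrl[val.url] = { label: val.label, url: "long link" + val.url };
});

var items = Object.keys(byUrl).map(function (k) { return byUrl[k]; });
// two unique items remain
```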