
HashMap iteration complexity

From the Java doc I know that:

Iteration over collection views requires time proportional to the "capacity" of the HashMap instance (the number of buckets) plus its size (the number of key-value mappings). Thus, it's very important not to set the initial capacity too high (or the load factor too low) if iteration performance is important.

Does it mean the time complexity for iteration over a HashMap is O(n²)? This question may sound silly, but I'm actually a bit confused.

No, this does not mean that the iteration complexity is O(n²).

When the capacity c is set automatically, it grows as O(n) with the number of items n, not as O(n²). According to the source code, the target capacity is computed as follows:

int targetCapacity = (int)(numKeysToBeAdded / loadFactor + 1);

where loadFactor is a float value set to 0.75f by default.
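For intuition, here is a small standalone sketch that plugs a few values into that formula (the class and method names are mine, not the JDK's; the real HashMap additionally rounds the capacity up to the next power of two, which at most doubles it and does not change the linear growth):

public class TargetCapacityDemo {
    static final float LOAD_FACTOR = 0.75f; // HashMap's default load factor

    // Same arithmetic as the quoted line from the HashMap source.
    static int targetCapacity(int numKeysToBeAdded) {
        return (int) (numKeysToBeAdded / LOAD_FACTOR + 1);
    }

    public static void main(String[] args) {
        for (int n : new int[] {10, 100, 1_000, 10_000}) {
            System.out.println(n + " keys -> target capacity " + targetCapacity(n));
        }
        // The computed capacity is roughly n / 0.75, i.e. linear in n, never quadratic.
    }
}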

The paragraph that you quote becomes relevant only when you set the capacity manually. It is telling you that iteration performance is O(c), not O(n), where c is the capacity that you set (or the capacity computed automatically).
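As a rough, unscientific illustration (class and variable names are mine, absolute timings will vary from machine to machine, and a serious measurement would use a proper benchmark harness): both maps below hold the same 100 entries, but the one constructed with a needlessly large initial capacity leaves the iterator far more empty buckets to skip over, so iterating it takes noticeably longer.

import java.util.HashMap;
import java.util.Map;

public class OversizedCapacityDemo {

    // Walks all entries of the map and reports how long the iteration took.
    static void timeIteration(String label, Map<Integer, Integer> map) {
        long start = System.nanoTime();
        long sum = 0;
        for (Map.Entry<Integer, Integer> e : map.entrySet()) {
            sum += e.getValue();
        }
        long elapsed = System.nanoTime() - start;
        System.out.println(label + ": sum=" + sum + ", " + elapsed + " ns");
    }

    public static void main(String[] args) {
        Map<Integer, Integer> normal    = new HashMap<>();           // capacity grows with the 100 entries
        Map<Integer, Integer> oversized = new HashMap<>(1_000_000);  // ~1M buckets for the same 100 entries
        for (int i = 0; i < 100; i++) {
            normal.put(i, i);
            oversized.put(i, i);
        }
        timeIteration("default capacity  ", normal);
        timeIteration("oversized capacity", oversized);
    }
}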

No, it means that the complexity of iterating over a HashMap is O(n + c), where n is the size (the number of key-value mappings) and c is the capacity (the number of buckets). The iterator must walk linearly over all c buckets and visit all n entries.
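To see where the c and the n come from, here is a minimal sketch of iteration over a chained hash table (the Node type and forEachEntry method are hypothetical names, not HashMap's real internals, but the cost structure is the same): the outer loop touches every one of the c bucket slots, and the inner loop touches each of the n stored entries exactly once.

public class BucketIterationSketch {

    // Minimal chained-bucket node; hypothetical, not HashMap's actual Node class.
    static final class Node<K, V> {
        final K key;
        final V value;
        Node<K, V> next;
        Node(K key, V value, Node<K, V> next) { this.key = key; this.value = value; this.next = next; }
    }

    // Visits every bucket slot once (c steps) and every stored entry once (n steps): O(n + c).
    static <K, V> void forEachEntry(Node<K, V>[] table) {
        for (Node<K, V> bucket : table) {                       // c iterations, even for empty buckets
            for (Node<K, V> e = bucket; e != null; e = e.next) { // n iterations in total across all buckets
                System.out.println(e.key + " -> " + e.value);
            }
        }
    }

    public static void main(String[] args) {
        @SuppressWarnings("unchecked")
        Node<String, Integer>[] table = (Node<String, Integer>[]) new Node[8]; // c = 8 buckets
        table[1] = new Node<>("a", 1, new Node<>("b", 2, null)); // two entries chained in one bucket
        table[5] = new Node<>("c", 3, null);                     // n = 3 entries in total
        forEachEntry(table);
    }
}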
