
What is the worst case scenario for an unordered_map?

I have found many posts about the complexity of map and unordered_map . It is said that unordered_map has a worst-case complexity of O(N) . For my purposes, the input will be sorted values like 1 2 5 6 9 11 12.. . I need to insert, find, and delete values, and I will do these operations quite frequently. I thought of using set , which has O(log N) complexity in all cases. Then I stumbled upon unordered_map , which has a best-case complexity of O(1). But I need to understand: in my scenario, will I face the worst case of unordered_map ? And what would that scenario be?

EDIT: In my case all the values will be unique.

The unordered_map worst case usually happens when the hash function produces collisions for every insertion into the map.

I said "usually" because the standard only specifies the worst-case complexity, not when or how it will happen, so strictly speaking the answer to your question is that it is implementation-defined.

Since all your values are unique, and apparently integers (which hash very well, possibly optimally; this is again implementation-dependent), you will not run into this worst-case scenario. insert/find/delete will be O(1), so it looks like a reasonable choice.

Depending on the implementation of the hashing algorithm, an ordered sequence of data might end up causing many collisions when inserted into an unordered_map. Since your data is ordered, it might be more advantageous to use a tree-based container such as std::set (assuming you don't need to store duplicate values).
