
HashMap key update vs double entries

I am using a HashMap to store objects under a key that changes over time.

HashMap<String, Stuff> hm = new HashMap<String, Stuff>();
Stuff stuff = new Stuff();
hm.put("OriginalKey", stuff);

I didn't find anything better than removing the "OriginalKey" entry and calling put() again with the same object under the new key.

hm.remove("OriginalKey");
hm.put("NewKey", stuff);

remove() seems to be taking a significant CPU toll, hence my questions:

  1. What is the actual memory cost of leaving duplicate entries (there is no risk of overlap)?
  2. Am I just missing some neat swapKey() method?

What is the actual memory cost of leaving duplicate entries (there is no risk of overlap)?

Well, you've got an extra entry, and the stale key itself can't be garbage collected. If the key is "large", that could be a problem. It also means that you'll never get an accurate size(), you'll never be able to sensibly iterate over all the values, and so on. It seems like a bad idea to me.

Am I just missing some neat swapKey() method?

There's no such thing - and it feels like a fairly rare requirement to me. Any such method would pretty much have to do what you're doing anyway - it has to find the old key, remove it from the data structure, and insert an entry for the new key. I can't easily imagine any optimizations possible just by knowing about both operations at once.
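
If it helps readability, such a helper is easy to write on top of the existing methods — a minimal sketch (the name swapKey is my own choice; it is not part of the JDK), doing exactly the remove-then-put described above:

import java.util.Map;

public final class MapUtil {
    // Moves the value stored under oldKey so it is stored under newKey.
    // Returns true if oldKey was present and the move happened.
    public static <K, V> boolean swapKey(Map<K, V> map, K oldKey, K newKey) {
        if (!map.containsKey(oldKey)) {   // tolerates null values
            return false;
        }
        V value = map.remove(oldKey);     // one hash lookup to remove
        map.put(newKey, value);           // one hash lookup to insert
        return true;
    }
}

Calling MapUtil.swapKey(hm, "OriginalKey", "NewKey") performs the same two hash lookups as the manual version; there is nothing cheaper for it to do.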

Swapping the key is not easily possible, since the key is used for hashing. Changing the key means that the hash value is most probably different, too. In that case, changing the key amounts to deletion followed by reinsertion.
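
To make that concrete, here is a rough sketch of the bucket arithmetic (simplified: the real HashMap also XOR-spreads the high bits of the hash before masking):

String oldKey = "OriginalKey";
String newKey = "NewKey";
int capacity = 16;  // default HashMap table size
// The bucket index is derived from the key's hash code, so a
// different key almost always lands in a different bucket.
System.out.println("old bucket: " + (oldKey.hashCode() & (capacity - 1)));
System.out.println("new bucket: " + (newKey.hashCode() & (capacity - 1)));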
