
Hashtable with limited size?

For managing some cache, I need a sort of Hashtable with the ability to remove the oldest elements in order to keep the MAXSIZE last elements in the table. I need to program this in Java but any algorithm with pseudo code would be fine too.

public interface LimitedHashtable<K, V> {
    void put(K k, V v); // will remove the oldest element from the table if size > MAXSIZE
    V get(K k);
}

Any idea?

Take a look at LinkedHashMap with an overridden removeEldestEntry -- if it returns size() > MAXSIZE, you'll have what you want. One of LinkedHashMap's constructors also has a boolean that says whether the "oldest" entry means the one that was added longest ago, or the one that was accessed longest ago.
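A minimal sketch of that approach, assuming a fixed limit of 100 entries and access-order eviction (both are illustrative choices, not part of the original answer):

import java.util.LinkedHashMap;
import java.util.Map;

public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private static final int MAX_SIZE = 100; // illustrative limit

    public BoundedCache() {
        // 16 and 0.75f are the default capacity and load factor;
        // true selects access order, so the "eldest" entry is the
        // least recently accessed one (pass false for insertion order).
        super(16, 0.75f, true);
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Returning true makes LinkedHashMap drop the eldest entry
        // automatically after each put once the size limit is exceeded.
        return size() > MAX_SIZE;
    }
}

With access order enabled, get() also counts as a use, so the map behaves like a small LRU cache.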

The only way I know of is to keep a queue of the keys alongside the hashtable, and use it to drain keys when the table and queue get too big.
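A rough sketch of that idea; the class name and capacity here are chosen only for illustration:

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

public class QueueBackedCache<K, V> {
    private final int maxSize;                          // illustrative capacity
    private final Map<K, V> map = new HashMap<>();
    private final Deque<K> order = new ArrayDeque<>();  // oldest key at the head

    public QueueBackedCache(int maxSize) {
        this.maxSize = maxSize;
    }

    public void put(K k, V v) {
        if (!map.containsKey(k)) {
            order.addLast(k);                            // remember insertion order
        }
        map.put(k, v);
        while (map.size() > maxSize) {
            map.remove(order.removeFirst());             // evict the oldest key
        }
    }

    public V get(K k) {
        return map.get(k);
    }
}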

Consider using Guava's CacheBuilder, which offers a maximumSize() amongst other options. Note that this behavior is more nuanced than what you might be looking for:

Specifies the maximum number of entries the cache may contain. Note that the cache may evict an entry before this limit is exceeded. As the cache size grows close to the maximum, the cache evicts entries that are less likely to be used again. For example, the cache may evict an entry because it hasn't been used recently or very often.
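For illustration, a bounded cache built with CacheBuilder might look like this (the limit of 100 and the key/value types are arbitrary):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class GuavaCacheExample {
    public static void main(String[] args) {
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .maximumSize(100)   // entries may be evicted as the size nears this bound
                .build();

        cache.put("key", "value");
        String v = cache.getIfPresent("key");  // null if absent or already evicted
        System.out.println(v);
    }
}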

You want a LinkedHashMap. You'll have to add some hooks to remove entries when that map gets too big.
