
How to clean a ConcurrentHashMap in a thread-safe way

I have a ConcurrentHashMap originalCache to which multiple threads write concurrently from many places in the code. I have to flush the map periodically, so I do the following: I create a temporary copy of the map and then clear the original map.

void flushCopyOfOriginalCache(String queueName) {
        Map<Integer, Date> copyOfOriginalCache = null;
        Object lock = getLastScheduledTimeUpdateLock(queueName);
        // This lock is NOT acquired by the writers, because writes happen in many places of the code
        synchronized (lock) {
            // Re-check whether the cache has expired
            if (isOrigCacheExpired(queueName)) {
                log.info("origCache expired for queue {}", queueName);
                if (originalCache.containsKey(queueName)) {
                    copyOfOriginalCache = new ConcurrentHashMap<>(
                            originalCache.get(queueName));
                    // How safe is the clear() below? Parallel writes might still be going on against originalCache!
                    originalCache.get(queueName).clear();
                }
            }
        }

        if (copyOfOriginalCache == null || copyOfOriginalCache.isEmpty()) {
            return;
        }
        // Expensive DB operations follow, using the data in copyOfOriginalCache
    }

How can I make sure that this clean method does not delete entries that are being written in parallel? Please guide.

For a ConcurrentHashMap you can use computeIfPresent together with a predicate canDeleted(Value v):

concurrentHashMap.forEach((key, value) ->
    concurrentHashMap.computeIfPresent(key, (k, v) -> canDeleted(v) ? null : v));

This is thread-safe and easy to implement.
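Below is a minimal, self-contained sketch of how this idea could be applied to the flush scenario from the question. The cache layout (queueName mapped to an inner map), the canDeleted predicate, and the collection of removed entries into a local map for the later DB write are illustrative assumptions, not code from the question.

    import java.util.Date;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class CacheFlusher {

        // Assumed layout, mirroring the question: queueName -> (id -> timestamp)
        private final ConcurrentHashMap<String, ConcurrentHashMap<Integer, Date>> originalCache =
                new ConcurrentHashMap<>();

        // Hypothetical predicate: decides whether an entry may be flushed
        private boolean canDeleted(Date value) {
            return value.before(new Date());
        }

        Map<Integer, Date> flush(String queueName) {
            Map<Integer, Date> flushed = new HashMap<>();
            ConcurrentHashMap<Integer, Date> inner = originalCache.get(queueName);
            if (inner == null) {
                return flushed;
            }
            // Remove entries key by key instead of calling clear().
            // computeIfPresent is atomic per key: a concurrent put on the same key
            // either happens before this check (and is removed only if canDeleted
            // allows it) or after it (and then survives the flush).
            inner.forEach((key, value) ->
                    inner.computeIfPresent(key, (k, v) -> {
                        if (canDeleted(v)) {
                            flushed.put(k, v); // keep a copy for the expensive DB write
                            return null;       // returning null removes the mapping
                        }
                        return v;              // keep the mapping
                    }));
            return flushed; // entries removed from the cache, ready to be persisted
        }
    }

Removing entries per key this way avoids the race in the original code, where entries written between the copy constructor and clear() would be dropped without ever reaching the DB.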
