
ConcurrentLinkedQueue$Node instances grow and JVM throws OOM

I am using ConcurrentLinkedQueue in a producer-consumer scenario. My producer is a singleton called from all the methods in my application: Producer.getInstance().add("foo");. The add() method delegates to the ConcurrentLinkedQueue offer() method:

public void add(String message) {
    myQueue.offer(message);
}

Meanwhile, my Consumer runs in another thread and simply calls poll() on the ConcurrentLinkedQueue inside the Producer.

Edit:

Here is the code that runs inside the if ((buffer = myQueue.poll()) != null) { } block.

CRActiveMQProducer is a singleton that initializes a connection to my ActiveMQ server and sends messages via its send() method.

private final StringBuffer stringBuffer = new StringBuffer();
private long numberMessage = 0L;

public void run() {
    while (condition) {
        String buffer;
        if ((buffer = myQueue.poll()) != null) {
            stringBuffer.append(buffer);
            numberMessage++;
            if (numberMessage >= 10000) {
                // Flush the accumulated batch to ActiveMQ every 10,000 messages
                CRActiveMQProducer.getInstance().send(stringBuffer.toString());
                stringBuffer.delete(0, stringBuffer.length());
                numberMessage = 0L;
            }
        }
    }
}

I call the Producer's add() method about 50 million times (yeah, it's huge, but that's only 2.5% of the number of calls that will eventually need to be made).

Anyway, I got an OutOfMemoryError. I read the heap dump with VisualVM and found that the OOM was caused by a huge number of ConcurrentLinkedQueue$Node instances (more than 30 million). I think a new node is created for each offer() call, but I'm not 100% sure (I can't load the full heap dump...).

Do you think this is normal behaviour for ConcurrentLinkedQueue, or am I doing something wrong? Thanks!

The queue obviously has to store all the elements you put into it, and it does so using nodes linked to each other. That's the principle of a linked queue. Your producer produces items faster than the consumer can consume them, so eventually you end up with an OOM.

You should consider using a bounded BlockingQueue: it would force the producer thread to block when the queue contains too many elements, avoiding the OOM.
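A minimal sketch of the idea, using an ArrayBlockingQueue (the capacity of 100,000 is an illustrative value; tune it to your memory budget):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BoundedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // A bounded queue: holds at most 100,000 elements at a time.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(100_000);

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 200_000; i++) {
                    queue.take(); // blocks until an element is available
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // put() blocks when the queue is full, so the producer can never
        // get ahead of the consumer by more than the queue's capacity,
        // which puts a hard cap on the queue's memory footprint.
        for (int i = 0; i < 200_000; i++) {
            queue.put("message-" + i);
        }
        consumer.join();
        System.out.println("remaining=" + queue.size()); // prints remaining=0
    }
}
```

With ConcurrentLinkedQueue, by contrast, offer() always succeeds and allocates a new node, so an unbounded backlog grows until the heap is exhausted.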

A couple of suggestions:

  1. Give your application much more memory. Try something like -Xmx2g.
  2. Add more consumer threads reading from the queue.
  3. Make the "do something with buffer" step as fast as possible.
  4. Spin "do something with buffer" off into its own thread (probably using a thread pool) so it doesn't block the loop that drains the queue.
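The last suggestion could look roughly like this sketch: the drain loop hands each full batch to a single-threaded ExecutorService so it never blocks on the send. The send() method here is a stand-in that just counts batches instead of talking to an ActiveMQ broker.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class AsyncSendDemo {
    // Stand-in for CRActiveMQProducer.getInstance().send(...):
    // counts batches instead of sending them over the network.
    static final AtomicInteger batchesSent = new AtomicInteger();
    static void send(String batch) { batchesSent.incrementAndGet(); }

    public static void main(String[] args) throws InterruptedException {
        ExecutorService sender = Executors.newSingleThreadExecutor();
        StringBuilder buffer = new StringBuilder();
        int numberMessage = 0;

        for (int i = 0; i < 30_000; i++) { // simulate draining 30,000 messages
            buffer.append("msg").append(i);
            if (++numberMessage >= 10_000) {
                // Hand the full batch to the sender thread so the drain
                // loop is never stalled by network I/O.
                final String batch = buffer.toString();
                sender.execute(() -> send(batch));
                buffer.setLength(0);
                numberMessage = 0;
            }
        }
        sender.shutdown();
        sender.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println("batches=" + batchesSent.get()); // prints batches=3
    }
}
```

Because each batch is snapshotted with toString() before being submitted, the sender thread and the drain loop never share the mutable buffer.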

Have you timed each step inside the reader to see how long it takes? If you really are putting items on the queue faster than you can take them off, you need to find ways to speed up the reader(s). Running out of memory is just a side effect of the real problem: you aren't processing the queued-up work fast enough.
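A simple way to get those timings is System.nanoTime() around each part of the loop. This sketch uses a placeholder doWork() in place of the real per-batch send; the absolute numbers will vary from run to run:

```java
public class ReaderTimingDemo {
    // Placeholder for the real per-batch work (e.g. the ActiveMQ send).
    static void doWork(StringBuilder batch) {
        batch.reverse(); // arbitrary stand-in work
    }

    public static void main(String[] args) {
        StringBuilder batch = new StringBuilder();
        long appendNanos = 0;

        for (int i = 0; i < 100_000; i++) {
            long t0 = System.nanoTime();
            batch.append("message-").append(i);
            appendNanos += System.nanoTime() - t0;
        }
        long t1 = System.nanoTime();
        doWork(batch);
        long workNanos = System.nanoTime() - t1;

        System.out.println("append total: " + appendNanos / 1_000 + " us");
        System.out.println("work total: " + workNanos / 1_000 + " us");
    }
}
```

Comparing these totals against the rate at which the producer enqueues messages tells you whether the consumer can ever keep up.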
