I'm new to Java and really need your help.
I am presently using a queue: a receiver thread puts data into it, and a parser thread reads from it. The problem is that the receiver may receive at an incredible peak rate, e.g. 3000 packets/sec, while the parser only parses at 100/sec.
EDIT: I have checked; the queue first stays at around 100 entries, then after ten seconds it starts to grow by about 100 per second, and the program crashes at 2000 or so. Could it be that there is a memory leak?
My code (in a tight loop) is
byte[] data = new byte[1024];
System.arraycopy(udpPacket.getData(), 0, data, 0, 1024);
queue.offer(data);
The heap fills up too quickly, and I get an OutOfMemoryError. I guess the problem is that the queue is backed by a linked list, and all the node pointers must be kept on the heap.
I know a C version that does the same thing (using a buffer) with much better performance, but because of deployment issues we can only use Java.
If you receive 3000 packets/sec but only process 100/sec, sooner or later you will run out of memory. May I suggest you use more threads to do the parsing?
Concerning the queue, have a look at LinkedBlockingDeque and LinkedBlockingQueue. They are both high-performance, thread-safe queue implementations.
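To illustrate the multi-threaded parsing idea, here is a minimal sketch that shares a bounded LinkedBlockingQueue between a simulated receiver and several parser threads. The parser count, queue capacity, and the stand-in "parsing" step are all assumptions to be tuned for the real workload:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

public class MultiParserSketch {
    public static void main(String[] args) throws InterruptedException {
        // Bounded, thread-safe queue shared between the receiver and the parsers
        BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>(1000);
        int items = 100;
        CountDownLatch done = new CountDownLatch(items);
        AtomicInteger parsed = new AtomicInteger();

        int parserCount = 4; // assumption: tune to roughly match the input rate
        for (int i = 0; i < parserCount; i++) {
            Thread parser = new Thread(() -> {
                try {
                    while (true) {
                        byte[] data = queue.take(); // blocks until data arrives
                        parsed.incrementAndGet();   // stand-in for real parsing work
                        done.countDown();
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            parser.setDaemon(true);
            parser.start();
        }

        // Simulate the receiver thread feeding packets into the queue
        for (int i = 0; i < items; i++) {
            queue.put(new byte[1024]);
        }
        done.await();
        System.out.println("parsed " + parsed.get() + " packets");
    }
}
```

With several parsers draining the same queue, the aggregate parse rate scales with the number of threads, as long as the parsing work itself can run in parallel.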
Since data comes in 30 times faster than it is processed, you may enlarge the heap using
java -Xms<initial heap size> -Xmx<maximum heap size>
but that only helps if the transmission finishes before your memory is exhausted.
If the producer produces more data than the consumer can handle, the data will accumulate and eventually you will run into OutOfMemory problems. How quickly depends on (1) the rate difference between the producer and the consumer, and (2) the quantity of data you have to process.
I suggest you limit the number of items in the queue. Use a BlockingDeque implementation such as LinkedBlockingDeque with a bounded capacity, and block your receive loop when the limit is reached. This way, the queue acts as a bounded buffer in front of the parser.
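A minimal sketch of this bounded-buffer approach, assuming a capacity of 500 (size it to however much backlog comfortably fits in the heap):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingDeque;

public class BoundedQueueSketch {
    public static void main(String[] args) throws InterruptedException {
        // Capacity of 500 is an assumption; size it to the backlog the heap can hold.
        BlockingQueue<byte[]> queue = new LinkedBlockingDeque<>(500);

        // Receiver side: put() blocks when the queue is full,
        // which throttles the receive loop instead of exhausting the heap.
        queue.put(new byte[1024]);

        // Parser side: take() blocks when the queue is empty.
        byte[] data = queue.take();
        System.out.println("got " + data.length + " bytes, queue size now " + queue.size());
    }
}
```

Note that blocking the receiver means the OS socket buffer can overflow and drop UDP packets instead, but that keeps the JVM heap safe.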
I guess the problem is that queue is made using a linked-list, and all the pointers must be saved in the heap.
I don't think so. I think the real problem is the mismatch between the rate at which your system receives input and the rate at which it can process it. Unless you can process at least as fast as the average input rate, you will eventually run out of memory, no matter how the queue is represented.
You must either improve the processing rate, reduce the input rate or ... drop data.
Another approach is to sample the data when the queue gets too large, and record the sampling rate so that the original data can be approximated.
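The "drop data" option is easy to sketch with a bounded queue: offer() returns false instead of blocking when the queue is full, so the receiver can discard the packet and keep a count of how many were dropped. The tiny capacity here is just for demonstration:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class DroppingReceiverSketch {
    public static void main(String[] args) {
        // Tiny capacity just to demonstrate dropping; real code would use a larger bound.
        BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(2);

        long received = 0, dropped = 0;
        for (int i = 0; i < 5; i++) {
            received++;
            // offer() returns false when the queue is full,
            // so the receiver drops the packet rather than blocking.
            if (!queue.offer(new byte[1024])) {
                dropped++;
            }
        }
        System.out.println("received=" + received + " dropped=" + dropped);
    }
}
```

The dropped count doubles as the sampling record mentioned above: received/(received - dropped) gives the effective sampling ratio.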
When you run java, you can use the -Xmx parameter to make more memory available to the virtual machine. For example, java -Xmx512m will allow the VM to allocate up to 512 MB of memory. (The default is fairly small.)
But if you're allocating memory and filling up a list with data and never removing it, eventually you're going to run out of memory, no matter which language you're using.
I have a theory: the implementation of ArrayDeque (at least in the Oracle JDK; I'm not sure about Android's) never shrinks its backing array. Slots for popped elements are simply nulled out, and once elements have been added to the tail faster than they are removed from the head, the "inner array" stays at its grown size forever, which will cause trouble sooner or later.
This code is from Oracle JDK 1.8.0_144:
public E pollFirst() {
    int h = head;
    @SuppressWarnings("unchecked")
    E result = (E) elements[h];
    // Element is null if deque empty
    if (result == null)
        return null;
    elements[h] = null; // Must null out slot
    head = (h + 1) & (elements.length - 1);
    return result;
}
Spells trouble for me. :(
If my analysis is correct, it seems that ArrayDeque was never intended to be a "real" FIFO queue and is not suitable for that purpose. (Unfortunately, that is exactly what I need right now.) I'm currently investigating LinkedList instead (which also implements Deque).
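For reference, using LinkedList through the Deque interface as a FIFO queue looks like this; unlike ArrayDeque's backing array, LinkedList allocates one node per element and frees it on removal (note that neither is thread-safe on its own):

```java
import java.util.Deque;
import java.util.LinkedList;

public class FifoSketch {
    public static void main(String[] args) {
        // LinkedList frees each node as it is removed,
        // whereas ArrayDeque keeps its (possibly grown) backing array.
        Deque<byte[]> fifo = new LinkedList<>();

        fifo.addLast(new byte[16]);      // enqueue at the tail
        fifo.addLast(new byte[32]);
        byte[] first = fifo.pollFirst(); // dequeue from the head

        System.out.println("dequeued " + first.length + " bytes, " + fifo.size() + " left");
    }
}
```

The trade-off is per-node allocation overhead and worse cache locality, so it is worth benchmarking both under the real workload.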