
Qt: Can emitting signals cause a stack overflow (or memory leak)? What happens if the connected slot/thread is blocked?

When the target thread that is supposed to receive the signal is blocked, what happens to the signal and the memory it occupies?

Do the signals go into a queue?
Can the queue overflow, so that some signals are lost?
Do we get a stack overflow?

When a signal is emitted and it is connected by a queued connection to one or more objects, an event is allocated on the heap for each connection and posted to the event queue associated with the receiving object.

Once the event is processed, it is deleted. Until it is processed, it occupies heap space, so if you keep posting events to a queue that is not being drained on the other side, memory consumption keeps growing without ever decreasing. You can count that as a memory leak. The size of the queue is not specified, but most likely you cannot overflow it without causing undefined behavior (or a well-defined exception such as std::bad_alloc).
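To see this mechanism concretely, here is a minimal sketch (the Producer and Consumer names are made up for illustration, not from the original question): the consumer lives in a worker thread and sleeps inside its slot, so every cross-thread emission posts another heap-allocated event to a queue that fills far faster than it drains.

```cpp
#include <QCoreApplication>
#include <QObject>
#include <QThread>
#include <QTimer>

class Consumer : public QObject {
    Q_OBJECT
public slots:
    void onData(int value) {
        QThread::sleep(1);                  // simulate a slow (blocked) consumer
        qDebug("processed %d", value);
    }
};

class Producer : public QObject {
    Q_OBJECT
signals:
    void data(int value);
};

int main(int argc, char *argv[]) {
    QCoreApplication app(argc, argv);

    QThread workerThread;
    Consumer consumer;
    consumer.moveToThread(&workerThread);
    workerThread.start();

    Producer producer;
    // Cross-thread, so Qt::AutoConnection resolves to a queued connection:
    // every emission allocates an event and posts it to the worker's queue.
    QObject::connect(&producer, &Producer::data,
                     &consumer, &Consumer::onData);

    // Emit far faster than the consumer drains: the queue (and heap usage)
    // grows without bound while the program runs.
    QTimer timer;
    QObject::connect(&timer, &QTimer::timeout, [&producer] {
        static int i = 0;
        emit producer.data(++i);
    });
    timer.start(0);

    return app.exec();
}

#include "main.moc"   // needed for Q_OBJECT classes defined in a .cpp file
```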

In general, it may happen that signals are produced faster than they are consumed. This can happen only if you use queued connections, which is typical in multithreaded code (cross-thread connections are queued by default) or when you explicitly create the connection with the Qt::QueuedConnection flag.

If your connection is not queued, this situation cannot arise: the slot processes the signal synchronously, immediately after it is emitted, so unprocessed signals never wait in a queue.
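For contrast, here is a tiny runnable sketch of the synchronous case. It uses QObject::objectNameChanged only because that is a ready-made signal on every QObject; with a same-thread connection the lambda runs inside the emit itself, before setObjectName returns.

```cpp
#include <QCoreApplication>
#include <QObject>
#include <QtDebug>

int main(int argc, char *argv[]) {
    QCoreApplication app(argc, argv);

    QObject sender;
    // Same-thread connection: the functor is invoked synchronously,
    // inside the emitting call itself. Nothing is queued.
    QObject::connect(&sender, &QObject::objectNameChanged,
                     [](const QString &name) {
                         qDebug() << "slot ran for" << name;
                     });

    qDebug() << "before emit";
    sender.setObjectName("demo");   // emits objectNameChanged directly
    qDebug() << "after emit";       // printed only after the slot returned
    return 0;
}
```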

So when you have a queued connection and you generate and emit signals faster than the consuming event loop can process them, they are indeed enqueued, they occupy memory (heap), and if this runs long enough, memory can eventually be exhausted (you would probably observe RAM swapping to disk, slowing the system down until it becomes unusable). As for the memory leaks you asked about: those would probably not occur, but a leak is your least concern here.

So you must avoid generating signals too fast. There are many ways to do that. For example, the emitting side can keep a timer and skip the emission if the previous signal was emitted less than, say, 100 ms ago. (I am using this for the progress bars in my app.)
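A minimal sketch of that idea, using QElapsedTimer and a made-up ThrottledEmitter class (this is not the answerer's actual code, just one way to implement the 100 ms rule):

```cpp
#include <QElapsedTimer>
#include <QObject>

class ThrottledEmitter : public QObject {
    Q_OBJECT
signals:
    void progress(int percent);

public:
    ThrottledEmitter() { m_sinceLastEmit.start(); }

    // Call as often as you like; the signal fires at most ~10 times/second.
    void reportProgress(int percent) {
        if (m_sinceLastEmit.elapsed() >= 100) {  // 100 ms minimum gap
            m_sinceLastEmit.restart();
            emit progress(percent);
        }
    }

private:
    QElapsedTimer m_sinceLastEmit;
};
```

Intermediate values are simply dropped, which is fine for progress reporting, where only the latest value matters.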

Another option is to implement two-way communication: the emitter sends a signal, the receiver processes it and emits another signal back as a response confirming that the processing is done. That confirmation is received by the original emitter, telling it that it is now safe to emit the next signal.
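Sketched with hypothetical Producer and Worker classes, the handshake looks like this; at most one unprocessed event sits in either queue at any time.

```cpp
#include <QObject>

class Worker : public QObject {
    Q_OBJECT
public slots:
    void process(int chunk) {
        // ... do the heavy work on `chunk` ...
        emit done();                     // acknowledge back to the producer
    }
signals:
    void done();
};

class Producer : public QObject {
    Q_OBJECT
public slots:
    void sendNext() {                    // connected to Worker::done
        emit produce(++m_chunk);         // previous chunk is confirmed done
    }
signals:
    void produce(int chunk);
private:
    int m_chunk = 0;
};

// Wiring (queued automatically when the objects live in different threads);
// calling producer.sendNext() once kicks off the ping-pong:
//   QObject::connect(&producer, &Producer::produce, &worker, &Worker::process);
//   QObject::connect(&worker, &Worker::done, &producer, &Producer::sendNext);
```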

Yet another option is to not use signals and slots at all and call methods directly, but then you need a proper synchronization mechanism in place, using atomics or mutex locking. Note that in this case no signals wait in a queue, but the threads may perform badly because they block each other too often.
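For example, a shared structure guarded by a mutex (a sketch with an assumed SharedBuffer type) replaces the event queue entirely; note how each call may block while the other thread holds the lock.

```cpp
#include <QMutex>
#include <QMutexLocker>
#include <QVector>

class SharedBuffer {
public:
    void push(int value) {
        QMutexLocker locker(&m_mutex);   // blocks if the other thread holds it
        m_data.append(value);
    }

    bool tryPop(int &value) {
        QMutexLocker locker(&m_mutex);
        if (m_data.isEmpty())
            return false;
        value = m_data.takeFirst();
        return true;
    }

private:
    QMutex m_mutex;
    QVector<int> m_data;
};
```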

Which method you choose is up to you. But you must definitely avoid the situation where you emit signals faster than the slot connected via a queued connection can process them.

