Optimizing interaction between Producer-Consumer threads

I have an implementation of Producer-Consumer multi-threaded interaction. It works, but I feel that wait states occur too often between the consumer thread and the producer. In my case the consumer accesses a queue at random intervals and takes data from it. The producer thread runs for the whole process lifetime and works as a caching machine: it checks in a loop whether the size of the queue is smaller than the maximum allowed cache size, and if so it keeps pushing new data into the cache. My concern is that when the consumer tries to access the queue, the queue is still locked by the producer thread and the consumer has to wait. Ideally, I would like the consumer to cause the producer to 'freeze' and release the lock immediately, so the consumer can retrieve data from the queue.

Here is how I am doing it now:

    //Called by the consumer (main thread) to retrieve data from the cache
    uint8_t* Worker::GetFrame() {

        boost::unique_lock<boost::mutex> lk(_frameCacheMutex);
        //If the cache is empty:
        while (0 == _frames_cache.size()) {
            //tell the producer to push data into the cache
            _fill_cache_wait_cond.notify_one();
            //wait for the data to arrive (will be signaled by the worker thread)
            _get_frame_wait_cond.wait(lk);
        }
        uint8_t* fr = _frames_cache.front();
        _frames_cache.pop();
        //notify the worker thread to continue caching
        _fill_cache_wait_cond.notify_one();

        return fr;
    }

Producer thread:

    void Worker::operator () () {
        //Some init here...
        while (isRunning) {
            boost::unique_lock<boost::mutex> lk(_frameCacheMutex);

            /// Here create the data for the cache...

            _frames_cache.push(data);

            /// Notify the waiting main thread to take data
            _get_frame_wait_cond.notify_one();

            /// If the cache is full, wait
            while (_frames_cache.size() == cacheMaxSize) {
                _fill_cache_wait_cond.wait(lk);
            }
        }
    }

At the moment, the producer holds the lock on the queue until the queue is full. Only then can the consumer access the queue, and it immediately signals that the queue is no longer full, so the producer locks the queue again.

At the very least, only take the lock once the data is ready to be pushed, and try to limit the number of push actions. If possible, you can also increase the priority of the consumer thread.
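
For example, a minimal sketch of the producer loop under that idea, reusing the names from the question and assuming a hypothetical createFrame() helper that builds a frame without touching the shared queue:

void Worker::operator () () {
    while (isRunning) {
        uint8_t* data = createFrame();   // hypothetical helper: expensive work done without the lock

        boost::unique_lock<boost::mutex> lk(_frameCacheMutex);
        // If the cache is full, wait; the lock is released inside wait()
        while (_frames_cache.size() >= cacheMaxSize) {
            _fill_cache_wait_cond.wait(lk);
        }
        _frames_cache.push(data);
        lk.unlock();                     // release the lock before notifying
        _get_frame_wait_cond.notify_one();
    }
}

This way the lock is only held for the push itself, so the consumer rarely finds the queue locked.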

By the way, this will not solve your immediate problem, but you can limit the number of notify_one() calls, because you only have to send one when the condition actually changes (not zero anymore to the consumer, not full anymore to the producer).
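
A rough sketch of what that could look like with the variables from the question (only the relevant lines are shown):

// Producer, after pushing: only wake the consumer on the empty -> non-empty transition
_frames_cache.push(data);
if (_frames_cache.size() == 1)
    _get_frame_wait_cond.notify_one();

// Consumer, after popping: only wake the producer on the full -> not-full transition
uint8_t* fr = _frames_cache.front();
_frames_cache.pop();
if (_frames_cache.size() == cacheMaxSize - 1)
    _fill_cache_wait_cond.notify_one();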

#include <atomic>
#include <thread>

std::atomic<bool> producerShouldSleep{false};
std::atomic<bool> producerIsSleeping{false};

Item consumeItem() { //consumer
    // Ask the producer to pause and wait until it confirms it is sleeping
    producerShouldSleep = true;
    while (!producerIsSleeping)
        std::this_thread::yield();
    auto item = getItemFromQueue(); // placeholder for the actual queue access
    producerShouldSleep = false;    // let the producer resume
    return item;
}

void produceData() { //producer
    while (shouldBeRunning) {
        if (producerShouldSleep) {
            producerIsSleeping = true;
            // Spin until the consumer has taken its item
            while (producerShouldSleep)
                std::this_thread::yield();
            producerIsSleeping = false;
            continue;
        }
        if (!queueIsFull())
            pushItemIntoQueue(); // placeholder for the actual queue push
    }
}

This will give priority to the consumer. Unless I screwed something up, it will be correctly synchronized. The only problem I see is that the queue could be empty while someone calls consumeItem in a tight loop, which may block the producer from pushing an item into the queue and ultimately cause a livelock.
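
One possible mitigation, just as a sketch: have the consumer wait until the queue is non-empty before pausing the producer, so an empty queue never keeps the producer frozen. This assumes a hypothetical queueIsEmpty() check that is safe to call while the producer is pushing (e.g. backed by an atomic size counter).

Item consumeItem() { //consumer
    // Don't freeze the producer while there is nothing to take
    while (queueIsEmpty())          // hypothetical, concurrency-safe emptiness check
        std::this_thread::yield();
    producerShouldSleep = true;
    while (!producerIsSleeping)
        std::this_thread::yield();
    auto item = getItemFromQueue();
    producerShouldSleep = false;
    return item;
}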
