
Asynchronous Queue Manager

I've run into a problem while writing an async multi-server network app in C#. I have many jobs being handled by the thread pool, and these include the writes to the network sockets. This ended up allowing more than one thread to write to a socket at the same time and garble my outgoing messages. My idea for getting around this was to implement a queue system: whenever data gets added to the queue, the socket writes it.

My problem is, I can't quite wrap my head around the architecture of something of this nature. I imagine a queue object that fires an event whenever data gets added to it. The event handler then writes the data held in the queue, but that won't work: if two threads add to the queue simultaneously, even if the queue itself is thread-safe, events will still fire for both and I'll run into the same problem. So maybe there's some way to hold off an event while another is in progress, but then how do I resume that event once the first finishes without simply blocking a thread on a mutex or something? This wouldn't be so hard if I weren't trying to stay strict with my "block nothing" architecture, but this particular application requires that the thread-pool threads keep doing their thing.

Any ideas?

Sounds like you need one thread writing to the socket synchronously and a bunch of threads writing to a queue for that thread to process.

You can use a blocking collection (BlockingCollection<T>) to do the hard work:

// somewhere there is a queue:

BlockingCollection<byte[]> queue = new BlockingCollection<byte[]>();

// in socket-writing thread, read from the queue and send the messages:

foreach (byte[] message in queue.GetConsumingEnumerable())
{
    // just an example... obviously you'd need error handling and stuff here
    socket.Send(message);
}

// in the other threads, just enqueue messages to be sent:

queue.Add(someMessage);

The BlockingCollection will handle all synchronization. You can also enforce a maximum queue length and other fun things.
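Putting those fragments together, a minimal runnable sketch might look like the following. A MemoryStream stands in for the real socket, which is an assumption for illustration; in real code the consumer loop would call socket.Send and handle errors.

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

var queue = new BlockingCollection<byte[]>();
var output = new MemoryStream(); // stand-in for the real socket stream

// Single consumer: the only code that ever touches the "socket".
var writer = Task.Run(() =>
{
    foreach (byte[] message in queue.GetConsumingEnumerable())
        output.Write(message, 0, message.Length); // socket.Send(message) in real code
});

// Many producers can enqueue concurrently without corrupting the stream.
Parallel.For(0, 10, i => queue.Add(new byte[] { (byte)i }));

queue.CompleteAdding(); // signal "no more messages"
writer.Wait();          // the enumerable ends once the queue is drained

Console.WriteLine(output.Length); // 10 bytes: one per message
```

Because only the writer task touches the stream, messages can never interleave, no matter how many threads call Add at once.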

While similar to Porges' answer, this differs a bit in implementation.

First, I usually don't queue the bytes to send, but objects, and serialize them in the sending thread — but I guess that's a matter of taste. The bigger difference is the explicit use of a ConcurrentQueue as the backing store for the BlockingCollection. So I'd end up with code similar to

BlockingCollection<Packet> sendQueue = new BlockingCollection<Packet>(new ConcurrentQueue<Packet>());
while (true)
{
    var packet = sendQueue.Take(); // this blocks if there are no items in the queue
    SendPacket(packet);            // send your packet here
}

The key takeaway here is that you have one thread that loops this code, and all other threads can add to the queue in a thread-safe way (both BlockingCollection and ConcurrentQueue are thread-safe).
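One caveat with the bare while (true) loop above is that it has no way to stop. If you keep the Take pattern, CompleteAdding plus the IsCompleted flag gives the sender thread a clean exit. A sketch under that assumption, with strings standing in for the Packet type and a ConcurrentQueue recording what SendPacket would have sent:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

var sendQueue = new BlockingCollection<string>(new ConcurrentQueue<string>());
var sent = new ConcurrentQueue<string>(); // stand-in for SendPacket's effect

var sender = Task.Run(() =>
{
    // IsCompleted becomes true once CompleteAdding() was called
    // and every queued item has been taken.
    while (!sendQueue.IsCompleted)
    {
        // TryTake returns false (instead of throwing, as Take() would)
        // if the collection is marked complete while we wait.
        if (sendQueue.TryTake(out var packet, Timeout.Infinite))
            sent.Enqueue(packet); // SendPacket(packet) in the real code
    }
});

sendQueue.Add("packet 1");
sendQueue.Add("packet 2");
sendQueue.CompleteAdding(); // lets the sender thread exit cleanly
sender.Wait();

Console.WriteLine(sent.Count); // 2
```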

Have a look at Processing a queue of items asynchronously in C#, where I answered a similar question.

I don't know C#, but what I would do is have the event trigger the socket manager to start pulling from the queue and writing things out one at a time. If it is already running, the trigger won't do anything, and once there is nothing left in the queue, it stops.

This solves the problem of two threads writing to the queue simultaneously because the second event would be a no-op.
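C# can express that "start the pump only if it isn't already running" idea without blocking, for example with a ConcurrentQueue plus an Interlocked flag. This is a sketch of the pattern, not code from the answer above; the `written` queue is a hypothetical stand-in for the socket writes:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

var queue = new ConcurrentQueue<string>();
var written = new ConcurrentQueue<string>(); // stand-in for socket writes
int pumping = 0; // 0 = idle, 1 = a drain task is running

void Enqueue(string message)
{
    queue.Enqueue(message);
    TryStartPump();
}

void TryStartPump()
{
    // Only one caller wins the 0 -> 1 transition; all others no-op.
    if (Interlocked.CompareExchange(ref pumping, 1, 0) != 0) return;
    Task.Run(() =>
    {
        while (queue.TryDequeue(out var message))
            written.Enqueue(message); // the single drain task touches the socket
        Interlocked.Exchange(ref pumping, 0);
        // Re-check: an item may have arrived after the loop but before the reset.
        if (!queue.IsEmpty) TryStartPump();
    });
}

Enqueue("hello");
Enqueue("world");
while (written.Count < 2) Thread.Sleep(1); // demo only: wait for the drain
Console.WriteLine(written.Count); // 2
```

The re-check after resetting the flag closes the race where a producer enqueues just as the pump decides it is done.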

You could have a thread-safe queue that all your worker threads write their results to. Then have another thread that polls the queue and sends results when it sees them waiting.
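A minimal version of that polling approach, sketched with ConcurrentQueue and hypothetical names; a real implementation would tune the poll interval and error handling:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

var results = new ConcurrentQueue<string>();
var sent = new ConcurrentQueue<string>(); // stand-in for the socket
using var done = new CancellationTokenSource();

var poller = Task.Run(() =>
{
    // Keep polling until shutdown is requested AND the queue is drained.
    while (!done.Token.IsCancellationRequested || !results.IsEmpty)
    {
        if (results.TryDequeue(out var item))
            sent.Enqueue(item); // write to the socket here
        else
            Thread.Sleep(10);   // poll interval: trades latency for CPU
    }
});

// Worker threads drop results into the queue concurrently:
Parallel.For(0, 5, i => results.Enqueue($"result {i}"));

done.Cancel();
poller.Wait();
Console.WriteLine(sent.Count); // 5
```

Compared with BlockingCollection, polling wastes a little latency (up to one sleep interval per message), which is why the blocking-collection answers above are usually preferable.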
