
C# Multi-Threading - Limiting the amount of concurrent threads

I have a question about controlling the number of concurrent threads I want running. Let me explain with what I currently do. For example:

 var myItems = getItems(); // is just some generic list

 // cycle through the items, picking 10 at a time
 int index = 0;
 int itemsToTake = myItems.Count >= 10 ? 10 : myItems.Count;
 while (index < myItems.Count)
 {
     var itemRange = myItems.GetRange(index, itemsToTake);
     AutoResetEvent[] handles = new AutoResetEvent[itemsToTake];

     for (int i = 0; i < itemRange.Count; i++)
     {
         var item = itemRange[i];
         handles[i] = new AutoResetEvent(false);

         // set up the thread
         ThreadPool.QueueUserWorkItem(processItems, new Item_Thread(handles[i], item));
     }

     // wait for all the threads to finish
     WaitHandle.WaitAll(handles);

     // update the index
     index += itemsToTake;
     // make sure that the next batch of items to get is within range
     itemsToTake = (itemsToTake + index < myItems.Count) ? itemsToTake : myItems.Count - index;
 }
This is the approach I currently take. However, I do not like it at all. I know I can 'manage' the thread pool itself, but I have heard it is not advisable to do so. So what is the alternative? The Semaphore class?

Thanks.
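[For reference, a minimal sketch of the semaphore idea raised above, using SemaphoreSlim (available from .NET 4). The names maxConcurrency and ProcessItem are placeholders, not from the original code:]

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class SemaphoreThrottleSketch
{
    static void Main()
    {
        var items = new List<int> { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
        int maxConcurrency = 3; // at most 3 items processed at once

        using (var throttle = new SemaphoreSlim(maxConcurrency))
        using (var done = new CountdownEvent(items.Count))
        {
            foreach (var item in items)
            {
                throttle.Wait(); // blocks once maxConcurrency items are in flight
                ThreadPool.QueueUserWorkItem(state =>
                {
                    try
                    {
                        ProcessItem((int)state); // placeholder for the per-item work
                    }
                    finally
                    {
                        throttle.Release(); // free a slot for the next item
                        done.Signal();
                    }
                }, item);
            }
            // wait for every item, replacing the batched WaitHandle.WaitAll
            done.Wait();
        }
    }

    static void ProcessItem(int item)
    {
        Console.WriteLine("processed " + item);
    }
}
```

Unlike the batching loop above, this keeps all slots busy: as soon as one item finishes, the next one starts, instead of waiting for the whole batch of 10.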

Instead of using ThreadPool directly, you might also consider using TPL or PLINQ. For example, with PLINQ you could do something like this:

getItems().AsParallel()
          .WithDegreeOfParallelism(numberOfThreadsYouWant)
          .ForAll(item => process(item));

or using Parallel:

var options = new ParallelOptions {MaxDegreeOfParallelism = numberOfThreadsYouWant};
Parallel.ForEach(getItems(), options, item => process(item));

Make sure that specifying the degree of parallelism does actually improve performance of your application. TPL and PLINQ use ThreadPool by default, which does a very good job of managing the number of threads that are running. In .NET 4, ThreadPool implements algorithms that add more processing threads only if that improves performance.

Don't use THE thread pool; get another one (just search Google, there are half a dozen implementations out there) and manage that yourself.

Managing THE thread pool is not advisable, as a lot of internal work may go through it; managing your OWN thread pool instance is totally OK.

It looks like you can control the maximum number of threads using ThreadPool.SetMaxThreads , although I haven't tested this.

Assuming the question is, "How do I limit the number of worker threads?", the answer would be to use a producer-consumer queue where you control the number of worker threads. Just queue your items and let it handle the workers.

Here is a generic implementation you could use.
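[The linked implementation is not reproduced here; as a rough sketch of the producer-consumer idea, .NET 4's BlockingCollection<T> can serve as the queue. The names workerCount and ProcessItem are placeholders:]

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

class ProducerConsumerSketch
{
    static void Main()
    {
        var queue = new BlockingCollection<int>();
        int workerCount = 4; // this is the concurrency limit

        // start a fixed number of consumer threads
        var workers = new List<Thread>();
        for (int i = 0; i < workerCount; i++)
        {
            var t = new Thread(() =>
            {
                // GetConsumingEnumerable blocks until items arrive
                // and completes once CompleteAdding has been called
                foreach (var item in queue.GetConsumingEnumerable())
                    ProcessItem(item);
            });
            t.Start();
            workers.Add(t);
        }

        // producer: just queue the items
        for (int i = 0; i < 20; i++)
            queue.Add(i);
        queue.CompleteAdding();

        foreach (var t in workers)
            t.Join();
    }

    static void ProcessItem(int item)
    {
        Console.WriteLine("processed " + item);
    }
}
```

The number of worker threads is the only knob: the producer never blocks on slow consumers (unless you give the collection a bounded capacity), and the workers exit cleanly when the queue is drained.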

In the documentation, there is a mention of SetMaxThreads:

public static bool SetMaxThreads(
    int workerThreads,
    int completionPortThreads
)

Sets the number of requests to the thread pool that can be active concurrently. All requests above that number remain queued until thread pool threads become available.

However:

You cannot set the number of worker threads or the number of I/O completion threads to a number smaller than the number of processors in the computer.

But I guess you are better served anyway by using a non-singleton thread pool.

There is no reason to deal with hybrid thread synchronization constructs (such as AutoResetEvent) and the ThreadPool.

You can use a class that can act as the coordinator responsible for executing all of your code asynchronously.

Wrap what "Item_Thread" does using a Task or the APM pattern. Then use the AsyncCoordinator class by Jeffrey Richter (it can be found in the code from the book CLR via C#, 3rd Edition).
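[A sketch of the Task-wrapping part of this suggestion, assuming .NET 4's TPL; it is not Richter's AsyncCoordinator, just the plain replacement for the AutoResetEvent[] bookkeeping. ProcessItem stands in for whatever Item_Thread does:]

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class TaskWrapSketch
{
    static void Main()
    {
        var items = new List<int> { 1, 2, 3, 4, 5 };
        var tasks = new List<Task>();

        foreach (var item in items)
        {
            int captured = item; // avoid the classic closure-over-loop-variable bug
            tasks.Add(Task.Factory.StartNew(() => ProcessItem(captured)));
        }

        // replaces the AutoResetEvent[] / WaitHandle.WaitAll bookkeeping
        Task.WaitAll(tasks.ToArray());
    }

    static void ProcessItem(int item)
    {
        Console.WriteLine("processed " + item);
    }
}
```

The scheduler (the ThreadPool by default) decides how many of these run concurrently; combine with ParallelOptions or a custom TaskScheduler if you need a hard cap.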
