
Dispatcher, Async/Await, Concurrent work

I have a bunch of async methods, which I invoke from the Dispatcher. The methods do not perform any work in the background; they just wait for some I/O operations, or wait for a response from the web server.

async Task FetchAsync()
{
    // Prepare request in UI thread
    var response = await new WebClient().DownloadDataTaskAsync(url); // url: address of the web server
    // Process response in UI thread
}

Now, I want to perform load tests by calling multiple FetchAsync() in parallel, with some max degree of parallelism.

My first attempt was using Parallel.ForEach(), but it does not work well with async/await.

var option = new ParallelOptions {MaxDegreeOfParallelism = 10};
Parallel.ForEach(UnitsOfWork, option, uow => uow.FetchAsync().Wait());

I've been looking at Reactive Extensions, but I'm still not able to take advantage of the Dispatcher and async/await.

My goal is to avoid creating a separate thread for each FetchAsync() call. Can you give me some hints on how to do it?

Just call FetchAsync without awaiting each call, and then use Task.WhenAll to await all of them together.

var tasks = new List<Task>();
var max = 10;
for(int i = 0; i < max; i++)
{
    tasks.Add(FetchAsync());
}

await Task.WhenAll(tasks);
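Note that this starts every call immediately, so it does not enforce the max degree of parallelism from the question. A minimal sketch that caps concurrency with a SemaphoreSlim, assuming 100 total calls and a cap of 10 (both values are arbitrary, and FetchAsync here is a stand-in stub for the method in the question):

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

static class LoadTest
{
    // Stand-in for the FetchAsync from the question.
    static Task FetchAsync() => Task.Delay(100);

    public static async Task RunAsync()
    {
        using var throttler = new SemaphoreSlim(10); // at most 10 in flight
        var tasks = new List<Task>();

        for (int i = 0; i < 100; i++)
        {
            tasks.Add(ThrottledFetch());
        }
        await Task.WhenAll(tasks);

        async Task ThrottledFetch()
        {
            await throttler.WaitAsync();     // wait for a free slot
            try { await FetchAsync(); }
            finally { throttler.Release(); } // free the slot for the next call
        }
    }
}
```

Because no Task.Run is involved, the awaits still resume on the Dispatcher's SynchronizationContext, so no extra threads are created per call.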

Here is a generic, reusable solution to your question that works not only with your FetchAsync method but with any async method that has the same signature. The API includes real-time concurrency throttling support as well:

The parameters are self-explanatory: totalRequestCount is how many async requests (FetchAsync calls) you want to make in total; asyncProcessor is the FetchAsync method itself; maxDegreeOfParallelism is an optional nullable parameter. If you want real-time throttling with a maximum number of concurrent async requests, set it; otherwise leave it null.

public static Task ForEachAsync(
    int totalRequestCount,
    Func<Task> asyncProcessor,
    int? maxDegreeOfParallelism = null)
{
    IEnumerable<Task> tasks;

    if (maxDegreeOfParallelism != null)
    {
        // Throttle: at most maxDegreeOfParallelism requests run concurrently.
        var throttler = new SemaphoreSlim(maxDegreeOfParallelism.Value, maxDegreeOfParallelism.Value);

        tasks = Enumerable.Range(0, totalRequestCount).Select(async requestNumber =>
        {
            await throttler.WaitAsync();
            try
            {
                await asyncProcessor().ConfigureAwait(false);
            }
            finally
            {
                throttler.Release();
            }
        });
    }
    else
    {
        // No throttling: start all requests immediately.
        tasks = Enumerable.Range(0, totalRequestCount).Select(requestNumber => asyncProcessor());
    }

    return Task.WhenAll(tasks);
}
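A usage sketch for the helper above (the counts 100 and 10 are arbitrary values chosen for illustration, and FetchAsync is the method from the question):

```csharp
// 100 FetchAsync calls in total, at most 10 running concurrently.
await ForEachAsync(100, FetchAsync, maxDegreeOfParallelism: 10);

// Unthrottled variant: all 100 requests start immediately.
await ForEachAsync(100, FetchAsync);
```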
