
Simple way to rate limit HttpClient requests

I am using HttpClient in System.Net.Http to make requests against an API. The API is limited to 10 requests per second.

My code is roughly like so:

    List<Task> tasks = items.Select(i => ProcessItem(i)).ToList();

    try
    {
        await Task.WhenAll(tasks);
    }
    catch (Exception ex)
    {
        // handle or log the exception
    }

The ProcessItem method does a few things but always calls the API via await SendRequestAsync(..blah), which looks like:

private async Task<Response> SendRequestAsync(HttpRequestMessage request, CancellationToken token)
{    
    token.ThrowIfCancellationRequested();
    var response = await HttpClient
        .SendAsync(request: request, cancellationToken: token).ConfigureAwait(continueOnCapturedContext: false);

    token.ThrowIfCancellationRequested();
    return await Response.BuildResponse(response);
}

Originally the code worked fine but when I started using Task.WhenAll I started getting 'Rate Limit Exceeded' messages from the API. How can I limit the rate at which requests are made?

It's worth noting that ProcessItem can make between 1 and 4 API calls depending on the item.

The API is limited to 10 requests per second.

Then just have your code do a batch of 10 requests, ensuring they take at least one second:

Items[] items = ...;

int index = 0;
while (index < items.Length)
{
  var timer = Task.Delay(TimeSpan.FromSeconds(1.2)); // ".2" to make sure
  var tasks = items.Skip(index).Take(10).Select(i => ProcessItemsAsync(i));
  var tasksAndTimer = tasks.Concat(new[] { timer });
  await Task.WhenAll(tasksAndTimer);
  index += 10;
}

Update

My ProcessItems method makes 1-4 API calls depending on the item.

In this case, batching is not an appropriate solution. You need to limit the number of concurrent calls to an asynchronous method, which implies a SemaphoreSlim. The tricky part is that you want to allow more calls over time.

I haven't tried this code, but the general idea I would go with is to have a periodic function that releases the semaphore up to 10 times. So, something like this:

private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(10);

private async Task<Response> ThrottledSendRequestAsync(HttpRequestMessage request, CancellationToken token)
{
  await _semaphore.WaitAsync(token);
  return await SendRequestAsync(request, token);
}

private async Task PeriodicallyReleaseAsync(Task stop)
{
  while (true)
  {
    var timer = Task.Delay(TimeSpan.FromSeconds(1.2));

    if (await Task.WhenAny(timer, stop) == stop)
      return;

    // Release the semaphore at most 10 times.
    for (int i = 0; i != 10; ++i)
    {
      try
      {
        _semaphore.Release();
      }
      catch (SemaphoreFullException)
      {
        break;
      }
    }
  }
}

Usage:

// Start the periodic task, with a signal that we can use to stop it.
var stop = new TaskCompletionSource<object>();
var periodicTask = PeriodicallyReleaseAsync(stop.Task);

// Wait for all item processing.
await Task.WhenAll(taskList);

// Stop the periodic task.
stop.SetResult(null);
await periodicTask;

The answer is similar to this one.
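For what it's worth, newer runtimes ship a built-in limiter: on .NET 7 and later, the System.Threading.RateLimiting package can replace the hand-rolled semaphore-plus-timer. A sketch (untested), assuming a 10-per-second fixed window:

```csharp
using System.Threading.RateLimiting;

private static readonly RateLimiter _limiter = new FixedWindowRateLimiter(
    new FixedWindowRateLimiterOptions
    {
        PermitLimit = 10,                  // 10 requests...
        Window = TimeSpan.FromSeconds(1),  // ...per one-second window
        QueueLimit = int.MaxValue,         // queue excess callers instead of rejecting them
        QueueProcessingOrder = QueueProcessingOrder.OldestFirst
    });

private async Task<Response> ThrottledSendRequestAsync(HttpRequestMessage request, CancellationToken token)
{
    // Acquire a permit; with the queue options above this waits until a slot opens.
    using RateLimitLease lease = await _limiter.AcquireAsync(permitCount: 1, cancellationToken: token);
    return await SendRequestAsync(request, token);
}
```

This removes the need for the separate PeriodicallyReleaseAsync task and its stop signal.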

Instead of using a list of tasks and WhenAll , use Parallel.ForEach and use ParallelOptions to limit the number of concurrent tasks to 10, and make sure each one takes at least 1 second:

Parallel.ForEach(
    items,
    new ParallelOptions { MaxDegreeOfParallelism = 10 },
    item => {
      ProcessItems(item);
      Thread.Sleep(1000); // Parallel.ForEach does not await async lambdas, so block synchronously
    }
);

Or if you want to make sure each item takes as close to 1 second as possible:

Parallel.ForEach(
    items,
    new ParallelOptions { MaxDegreeOfParallelism = 10 },
    item => {
        var watch = Stopwatch.StartNew(); // From System.Diagnostics
        ProcessItems(item);
        watch.Stop();
        if (watch.ElapsedMilliseconds < 1000)
            Thread.Sleep((int)(1000 - watch.ElapsedMilliseconds));
    }
);

Or:

Parallel.ForEach(
    items,
    new ParallelOptions { MaxDegreeOfParallelism = 10 },
    item => {
        Task.WhenAll(
                Task.Delay(1000),
                Task.Run(() => { ProcessItems(item); })
            ).GetAwaiter().GetResult(); // block so the parallel slot stays occupied
    }
);
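On .NET 6 and later, Parallel.ForEachAsync awaits asynchronous bodies properly, which plain Parallel.ForEach does not. A sketch under that assumption, reusing the question's ProcessItem (which returns a Task):

```csharp
// .NET 6+ sketch (untested): ForEachAsync awaits the async body, so each
// of the 10 parallel slots stays occupied for at least one second per item.
await Parallel.ForEachAsync(
    items,
    new ParallelOptions { MaxDegreeOfParallelism = 10 },
    async (item, cancellationToken) =>
    {
        await Task.WhenAll(
            Task.Delay(1000, cancellationToken),
            ProcessItem(item));
    });
```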

UPDATED ANSWER

My ProcessItems method makes 1-4 API calls depending on the item. So with a batch size of 10 I still exceed the rate limit.

You need to implement a rolling window in SendRequestAsync. A queue containing the timestamp of each request is a suitable data structure. You dequeue entries whose timestamp is older than one second (the rate-limit window). As it happens, there is an implementation as an answer to a similar question on SO.
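The rolling-window idea can be sketched as follows (untested; WaitForRequestSlotAsync is a hypothetical helper you would call at the top of SendRequestAsync, e.g. with maxRequests: 10 and a one-second window):

```csharp
// Queue of recent request timestamps, guarded by a SemaphoreSlim used as an async mutex.
private static readonly Queue<DateTime> _requestTimes = new Queue<DateTime>();
private static readonly SemaphoreSlim _gate = new SemaphoreSlim(1, 1);

private static async Task WaitForRequestSlotAsync(
    int maxRequests, TimeSpan window, CancellationToken token)
{
    await _gate.WaitAsync(token);
    try
    {
        while (true)
        {
            var now = DateTime.UtcNow;

            // Drop timestamps that have fallen out of the window.
            while (_requestTimes.Count > 0 && now - _requestTimes.Peek() >= window)
                _requestTimes.Dequeue();

            if (_requestTimes.Count < maxRequests)
            {
                _requestTimes.Enqueue(now);
                return; // Caller may send its request now.
            }

            // Wait until the oldest timestamp ages out, then re-check.
            await Task.Delay(window - (now - _requestTimes.Peek()), token);
        }
    }
    finally
    {
        _gate.Release();
    }
}
```

Because the gate is held while waiting, callers are released in arrival order; that is usually what you want for a client-side limiter.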

ORIGINAL ANSWER

May still be useful to others

One straightforward way to handle this is to batch your requests in groups of 10, run those concurrently, and then wait until a full second has elapsed (if it hasn't already). This brings you in right at the rate limit if the batch of requests can complete within one second, but is less than optimal if the batch takes longer. Have a look at the .Batch() extension method in MoreLinq. Code would look approximately like

foreach (var taskList in tasks.Batch(10)) // 'tasks' must be a lazily-evaluated IEnumerable<Task>
{
    Stopwatch sw = Stopwatch.StartNew(); // From System.Diagnostics
    await Task.WhenAll(taskList.ToArray());
    if (sw.Elapsed.TotalSeconds < 1.0)
    {
        // Wait out the remainder of the one-second window.
        // You might want to wait 1.1 or 1.2 seconds just in case the rate
        // limiting on the other side isn't perfectly implemented.
        await Task.Delay(TimeSpan.FromSeconds(1.0) - sw.Elapsed);
    }
}

I've written a library to help with this sort of logic: https://github.com/thomhurst/EnumerableAsyncProcessor

Usage would be:

var responses = await AsyncProcessorBuilder.WithItems(items) // Or Extension Method: items.ToAsyncProcessorBuilder()
        .SelectAsync(item => ProcessItem(item), CancellationToken.None)
        .ProcessInParallel(levelOfParallelism: 10, TimeSpan.FromSeconds(1));
