
Limit concurrency per API endpoint

How can I limit concurrency per API endpoint in .NET Core? I need one endpoint in my microservice to be limited to a single concurrent request.

Though I am curious about the actual details of the use case behind a single-request web service, I will answer your question.

The most elegant solution that comes to mind is custom middleware: Write custom ASP.NET Core middleware

The code for it might be something like this:

// Example middleware
using System;
using System.Net;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Caching.Memory;

namespace YourNamespaceHere
{
    public class MySingleRequestLimitingMiddleware
    {
        private readonly RequestDelegate _next;
        private readonly IMemoryCache _cache;
        // Public constant so the controller can reset the flag using the same cache key.
        public const string FlagCacheKey = "RequestAvailability";
        private readonly string _statusLocked = "locked";
        private readonly string _statusAvailable = "available";

        public MySingleRequestLimitingMiddleware(RequestDelegate next,
            IMemoryCache cache)
        {
            _next = next;
            _cache = cache;
        }

        public async Task Invoke(HttpContext httpContext)
        {
            string cacheEntry;

            // Set cache options.
            var cacheEntryOptions = new MemoryCacheEntryOptions()
                // Keep in cache for 2 minutes.
                .SetAbsoluteExpiration(DateTimeOffset.Now.AddMinutes(2));

            // Look for the cache key.
            if (!_cache.TryGetValue(FlagCacheKey, out cacheEntry))
            {
                // Key not in cache, so set lock.
                cacheEntry = _statusAvailable;

                // Save data in cache.
                _cache.Set(FlagCacheKey, cacheEntry, cacheEntryOptions);
            }

            if (cacheEntry.Equals(_statusLocked, StringComparison.OrdinalIgnoreCase))
            {
                httpContext.Response.StatusCode = (int)HttpStatusCode.TooManyRequests;
                httpContext.Response.ContentType = "text/plain";

                await httpContext.Response.WriteAsync("Maximum API calls admitted: 1.");

                return;
            }

            /* ***BEWARE: The next line locks this endpoint until the flag is
             * reset. You must reset the flag (in your controller or elsewhere)
             * on every single request, without fail, or the endpoint will be
             * left in an unusable state. If you fail to reset it, the only
             * things that recover the state are restarting the service or
             * waiting out the 2-minute cache expiration. Note also that the
             * check-then-set above is not atomic, so two requests arriving at
             * exactly the same instant could both get through. */
            _cache.Set(FlagCacheKey, _statusLocked, cacheEntryOptions);

            await _next.Invoke(httpContext);
        }
    }
}
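
Note that the Startup code below registers the middleware via UseMySingleRequestLimitingMiddleware(), which is not built into ASP.NET Core; following the usual convention you would define it yourself as a small extension method. A minimal sketch (the extension class name is just an assumption):

// Extension method so the middleware can be registered with the usual app.UseXxx() syntax.
using Microsoft.AspNetCore.Builder;

namespace YourNamespaceHere
{
    public static class MySingleRequestLimitingMiddlewareExtensions
    {
        public static IApplicationBuilder UseMySingleRequestLimitingMiddleware(this IApplicationBuilder builder)
        {
            return builder.UseMiddleware<MySingleRequestLimitingMiddleware>();
        }
    }
}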

// In your Startup.cs
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

namespace YourNamespaceHere
{
    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddMemoryCache(); // Add this line to yours somewhere
        }

        public void Configure(IApplicationBuilder app)
        {
            // Register this at the very top because nothing else needs to run
            // if the limit of 1 request has already been reached.
            app.UseWhen(context => context.Request.Path.StartsWithSegments("/somepathtoyourcallhere"), appBuilder =>
            {
                appBuilder.UseMySingleRequestLimitingMiddleware();
            });

            // The rest of your code
        }
    }
}

// Example Controller
// In the controller containing your single-request action.
// ***NOTE: you MUST reset the flag even if an error occurs!!!
using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;
using YourNamespaceHere;

public class SomeController : Controller
{
    private IMemoryCache _cache;

    public SomeController(IMemoryCache memoryCache)
    {
        _cache = memoryCache;
    }

    [HttpPost]
    public IActionResult YourSingleServiceRequestActionNameHere(SomeModel model)
    {
        // Some code here...

        // Reset the flag so the next request is allowed through.
        _cache.Set(MySingleRequestLimitingMiddleware.FlagCacheKey, "available",
            new MemoryCacheEntryOptions()
                .SetAbsoluteExpiration(DateTimeOffset.Now.AddMinutes(2)));

        return Ok();
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            // Just in case the flag wasn't reset already, make sure it is.
            _cache.Set(MySingleRequestLimitingMiddleware.FlagCacheKey, "available",
                new MemoryCacheEntryOptions()
                    .SetAbsoluteExpiration(DateTimeOffset.Now.AddMinutes(2)));
        }

        base.Dispose(disposing);
    }
}

You can also consider using IP- or client-based rate limiting middleware such as AspNetCoreRateLimit; a minimal registration sketch is shown below.
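
As a rough illustration (not part of the original code above; the "IpRateLimiting" section name, the endpoint path, and the 1-request-per-second rule are illustrative assumptions, and the exact registrations can differ between package versions), wiring up AspNetCoreRateLimit for IP-based limiting might look like this:

// Startup.cs sketch for AspNetCoreRateLimit (IP-based limiting).
using AspNetCoreRateLimit;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public Startup(IConfiguration configuration) => Configuration = configuration;

    public IConfiguration Configuration { get; }

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMemoryCache();

        // Bind the "IpRateLimiting" section of appsettings.json, e.g. a general rule like
        // { "Endpoint": "*:/somepathtoyourcallhere", "Period": "1s", "Limit": 1 }.
        services.Configure<IpRateLimitOptions>(Configuration.GetSection("IpRateLimiting"));

        // In-memory stores for counters and policies (newer package versions provide
        // this helper; older versions register the stores individually).
        services.AddInMemoryRateLimiting();
        services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();
    }

    public void Configure(IApplicationBuilder app)
    {
        // Put the rate limiter early in the pipeline.
        app.UseIpRateLimiting();

        // The rest of your pipeline...
    }
}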

Since we do not know when the request will complete, the cache expiration is set to 2 minutes, so the service stays unusable for no longer than 2 minutes if your code fails to reset the flag. Does your call take 2 minutes to complete? If so, the web request may time out before it even finishes.

If it does take more than 2 minutes to complete, consider whether a web service call is the proper solution at all.

Hope that helps.

Happy coding!!!
