
How to prevent known and unknown bots c#

I have a simple web application, which is hosted, and I have enabled Google Search, so Google's bots crawl it. I can also see many unknown bots crawling my application. I need to identify the valid users visiting the website (excluding bots).

I have used

httprequest.Browser.Crawler

But it doesn't work properly.

Can anyone please help me to prevent this fully?

You can use Request.UserAgent to see the User-Agent of each request, and then match it against a list of known crawler User-Agent strings.

I would probably use an action filter for this and then register it in your FilterConfig:

public class CrawlerFilter : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext context)
    {
        // The raw User-Agent header sent by the client
        var userAgent = context.HttpContext.Request.UserAgent;

        // Match userAgent against your crawler list here and, if it is a
        // known crawler, short-circuit the request, e.g.:
        // context.Result = new HttpStatusCodeResult(403);
    }
}

The problem with this is that not all crawlers report their User-Agent honestly, so you can never be completely certain.
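To show what "match against a list of crawler User-Agent strings" might look like in practice, here is a minimal sketch. The class name, method name, and the (deliberately short) token list are my own illustrative choices, not an official API; a real deployment would use a maintained crawler database.

```csharp
using System;
using System.Linq;

public static class CrawlerDetector
{
    // Partial, illustrative list of tokens that appear in common crawler
    // User-Agent strings; extend or replace with a maintained list.
    private static readonly string[] CrawlerTokens =
    {
        "Googlebot", "Bingbot", "Slurp", "DuckDuckBot",
        "Baiduspider", "YandexBot", "bot", "crawler", "spider"
    };

    public static bool IsKnownCrawler(string userAgent)
    {
        // Real browsers always send a User-Agent; treat a missing
        // header as suspicious.
        if (string.IsNullOrEmpty(userAgent))
            return true;

        // Case-insensitive substring match against each known token
        return CrawlerTokens.Any(token =>
            userAgent.IndexOf(token, StringComparison.OrdinalIgnoreCase) >= 0);
    }
}
```

You would call `CrawlerDetector.IsKnownCrawler(userAgent)` from the filter above and set a 403 result when it returns true. Note that substring matching is deliberately loose: it catches well-behaved crawlers that identify themselves, but a bot that fakes a browser User-Agent will slip through, which is the caveat mentioned above.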

Edit

I just learned that Request.Browser.Crawler is basically what I've suggested above, although the list behind Browser.Crawler does not seem to be maintained very well.
