
Preventing spammy XMLHttpRequests to PHP

I have a site that sends XMLHttpRequests to a PHP file that handles the HTTP POST request and returns data in JSON format. The URLs for the post_requests files are public information (since a user can just view the JS code for a page and find the URLs I'm sending HTTP requests to).

I mainly handle HTTP POST requests in PHP by doing this:

//First verify XMLHTTPRequest, then get the post data
if (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest')
{
  $request = file_get_contents('php://input');
  $data = json_decode($request);
  //Do stuff with the data
}

Unfortunately, I'm fairly sure that header can be spoofed, so some devious user or click bot could just spam my POST endpoints, repeatedly querying my database until either my site goes down or they go down fighting.

I'm not sure whether such requests would play a HUGE role in freezing the server (20 requests per second isn't that much), but should I be doing something about this, especially in the case of a DDoS attack? I've heard of rate limiting, where you record every time some IP requests data and then check whether its requests are spammy in nature:

INSERT INTO logs (ip_address, page, date) VALUES ('$ip', '$page', NOW())
//And then every time someone loads the PHP POST handler, check whether they loaded the same one in the past second or past 10 seconds
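In other words, something roughly like this on every request (just a sketch; it assumes a PDO connection in $pdo and the illustrative logs table above):

//Sketch: refuse the request if this IP hit the same page too often in the last 10 seconds
$stmt = $pdo->prepare(
  'SELECT COUNT(*) FROM logs
   WHERE ip_address = ? AND page = ? AND date > NOW() - INTERVAL 10 SECOND'
);
$stmt->execute([$ip, $page]);

if ($stmt->fetchColumn() >= 20)
{
  http_response_code(429); //Too Many Requests
  exit;
}

//Otherwise log this request and carry on
$pdo->prepare('INSERT INTO logs (ip_address, page, date) VALUES (?, ?, NOW())')
    ->execute([$ip, $page]);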

But that means every time there's a request by a normal user, I have to expend resources to log them. Is there a standard or better "practice" (maybe some server configuration?) for preventing or dealing with this concern?

Edit: Just for clarification, I'm referring to some person coding a piece of software (with a cookie, or while logged in) that just sends millions of requests per second to all of the PHP POST handler files on my site.

The solution for this is to rate-limit requests, usually per client IP.

Most web servers have modules which can do this, so use one of them - that way your application only receives the requests it's supposed to handle.
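For example, nginx ships ngx_http_limit_req_module; a minimal sketch (the zone name, rate, burst value and the /api/ location are all illustrative, not a drop-in config) could look like this:

# inside the http {} block
limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;

server {
    location /api/ {
        # queue short bursts; anything above the burst is rejected with 429
        limit_req zone=api burst=20 nodelay;
        limit_req_status 429;
        # fastcgi_pass / proxy_pass to the PHP backend as usual
    }
}

Apache has comparable third-party modules (e.g. mod_evasive) if that is your server.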


There are many things you can do:

  • Use tokens to authenticate requests. Save the token in the session and allow only a limited number of requests per token (e.g. 20). Also make tokens expire after some amount of time (e.g. 5 minutes). The exact values depend on your site's usage patterns. This of course will not stop an attacker, who can simply refresh the site and grab a new token, but it is a small and almost costless aggravation (see the first sketch after this list).

  • Once you have tokens, require a captcha after several token refreshes. Again, adjust the thresholds to your usage patterns so that regular users rarely see the captcha.

  • Adjust your server's firewall rules. Use the iptables connlimit and recent modules (see http://ipset.netfilter.org/iptables-extensions.man.html). This reduces the request rate that even reaches your HTTP server, so it becomes harder to exhaust its resources (example rules are sketched after this list as well).
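As a sketch of the first point, per-token throttling inside the POST handler could look roughly like this (the 20-request quota, the 5-minute lifetime, and the assumption that the token is embedded in the page and echoed back in the JSON body are all illustrative):

//Issue a short-lived token in the session and throttle how often it can be used
session_start();

$now = time();

//Hand out a fresh token if none exists yet or the current one is older than 5 minutes
if (!isset($_SESSION['token']) || $now - $_SESSION['token_issued'] > 300)
{
  $_SESSION['token']        = bin2hex(random_bytes(16));
  $_SESSION['token_issued'] = $now;
  $_SESSION['token_uses']   = 0;
}

$data = json_decode(file_get_contents('php://input'));

//Reject requests with a missing or stale token, or more than 20 uses of the same token
if (!isset($data->token)
    || !hash_equals($_SESSION['token'], (string) $data->token)
    || ++$_SESSION['token_uses'] > 20)
{
  http_response_code(429); //Too Many Requests
  exit;
}

//...handle the request as before...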
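For the firewall point, illustrative connlimit/recent rules could look like this (the port, thresholds and list name are placeholders to adapt):

# connlimit: reject a source IP once it holds more than 20 simultaneous connections to port 80
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j REJECT

# recent: drop sources that open more than 15 new connections within 10 seconds
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m recent --name HTTP --update --seconds 10 --hitcount 15 -j DROP
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --name HTTP --set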
