
What's the limit on asynchronous requests to a PHP script making cURL requests?

As the title states: what is the maximum number of asynchronous requests that can be made to a PHP script that makes cURL requests and returns some information (or an approximate number, if it is hardware dependent)?

For example:

A piece of JavaScript runs in a loop 50 times, making asynchronous GET requests via AJAX to the same PHP script and storing the results in an array as they are returned.

How many such requests could reasonably be made, or can the server only process one cURL request at a time?

What you're looking for are the curl_multi_* functions. An example would look like this:

// Create the cURL multi handle
$mh = curl_multi_init();

// Keep the individual handles so the responses can be collected later
$handles = array();

// Loop over the URLs and add each one to the cURL multi queue
foreach ($urls as $url) {

    // Create a cURL handle for the current URL
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    // Return the response as a string instead of printing it out
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    // Set all your other curl_setopt() options here for the transfers

    // curl_init() returns false on failure, so only queue valid handles
    if ($ch !== false) {
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }
}

// Start the transfers
do {
    $mrc = curl_multi_exec($mh, $active);
} while ($mrc == CURLM_CALL_MULTI_PERFORM);

// Keep going until every transfer has finished
while ($active && $mrc == CURLM_OK) {
    // Wait for activity on any handle; sleep briefly if curl_multi_select()
    // returns -1, so the loop doesn't spin at 100% CPU
    if (curl_multi_select($mh) == -1) {
        usleep(100);
    }
    do {
        $mrc = curl_multi_exec($mh, $active);
    } while ($mrc == CURLM_CALL_MULTI_PERFORM);
}
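
Once the loop finishes, the responses can be read back from the individual handles. A minimal follow-up sketch, assuming each handle was created with CURLOPT_RETURNTRANSFER enabled and collected in the $handles array as above:

// Collect each response body and release its handle
foreach ($handles as $ch) {
    $content = curl_multi_getcontent($ch); // works because CURLOPT_RETURNTRANSFER is set
    // ... process $content here ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);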

I am using that on one of my sites to download around 50 URLs, and it works perfectly. Of course, it all depends on the memory allocated to your script. You can also set a timeout so that no single URL can make your script hang for too long, as shown below.
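
For instance, the per-transfer timeouts can be set with the standard cURL options before each handle is added to the queue (the values below are only illustrative):

// Give up if connecting takes more than 5 seconds...
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
// ...or if the whole transfer takes more than 15 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 15);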

Edit: To actually answer your question: I'm not sure what the limit is on the number of URLs you can feed into this function. I believe it varies from server to server, depending on connection speed and memory, and possibly on CPU if you're doing something processor-intensive with the results. That said, if all of the requests are being made to the same remote server, you may run into bottleneck issues depending on its configuration, and this is independent of whether you use PHP's multiple cURL requests or AJAX. All web servers are built to handle many concurrent requests, even to the same script, but you can still run into issues in any of the following situations:

  • Each request is very resource-consuming.
  • The remote server needs to lock the database, in which case all requests are executed one by one anyway, so there is very little difference between sending them all at once and sending them one at a time.
  • The remote server uses PHP's file-based sessions and your requests require a session: PHP locks the session file for the duration of each request, so those requests are serialized as well.
  • The remote server is configured to allow only a small number of concurrent clients (generally Nginx handles many concurrent clients better than Apache, since it doesn't need to fork a new process for every request, so if this becomes an issue you might want to look into switching web servers). In this case, subsequent requests are queued up to a certain hard limit, and anything beyond that is dropped. Throttling your own concurrency helps here; see the sketch after this list.
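
As an illustration only (none of this is from the original answer; $urls and fetch_batch() are hypothetical names), here is a minimal batching sketch: it walks the URL list in chunks of 10 with curl_multi, so the remote server never sees more than 10 concurrent connections from this script:

// Hypothetical helper: fetch one batch of URLs concurrently and
// return the response bodies keyed by URL.
function fetch_batch(array $urls)
{
    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 15); // don't let one slow URL hang the batch
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Run the transfers, waiting for socket activity between iterations
    do {
        $mrc = curl_multi_exec($mh, $active);
        if ($active && curl_multi_select($mh) == -1) {
            usleep(100);
        }
    } while ($active && $mrc == CURLM_OK);

    $results = array();
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}

// Fetch 50 URLs, but only 10 at a time
$results = array();
foreach (array_chunk($urls, 10) as $batch) {
    $results += fetch_batch($batch);
}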
