
How to prevent 503 error while using curl_multi

I'm trying to fetch 10 webpages simultaneously.

I'm using curl_multi.

However, I end up with a lot of 503 (too many requests) errors on most of the fetched webpages. How can I fix this?

Here's the PHP script that I ran: http://pastebin.com/HhrffciC

You can run it on any PHP-enabled server.

Here is what the output on my machine looked like: http://i.imgur.com/dhnXJwz.jpg

There is a library called ParallelCurl that lets you control how many simultaneous requests are sent out. The script below sets the maximum to 5 and simply sends a series of GET requests to the URLs in your code. If it still shows 503 errors for you (it doesn't for me), lower $max_requests to suit your needs.

<?php

require __DIR__ . '/parallelcurl.php';

// Callback run by ParallelCurl as each request completes; it just prints the response body.
function on_request_done($content, $url, $ch, $search) {
    echo $content;
}

$data = array(
    'http://www.codechef.com/status/CLETAB,tacoder',
    'http://www.codechef.com/status/CRAWA,tacoder',
    'http://www.codechef.com/status/EQUAKE,tacoder',
    'http://www.codechef.com/status/MOU2H,tacoder',
    'http://www.codechef.com/status/PRGIFT,tacoder',
    'http://www.codechef.com/status/PUSHFLOW,tacoder',
    'http://www.codechef.com/status/REVERSE,tacoder',
    'http://www.codechef.com/status/SEASHUF,tacoder',
    'http://www.codechef.com/status/SIGFIB,tacoder',
    'http://www.codechef.com/status/TSHIRTS,tacoder'
);

$max_requests = 5; // no more than 5 requests in flight at once
$parallel_curl = new ParallelCurl($max_requests);

// Queue a GET request for each URL; ParallelCurl throttles them to $max_requests at a time.
foreach ($data as $url) {
    $parallel_curl->startRequest($url, 'on_request_done');
}

// Block until every queued request has completed.
$parallel_curl->finishAllRequests();

The GitHub README explains how to use the library further.
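
If you would rather stick with plain curl_multi instead of pulling in ParallelCurl, the same idea applies: keep only a small window of transfers in flight and start a new one each time another finishes. Below is a minimal, untested sketch along those lines; the cap of 5 and the URL list mirror the script above, and the start_handle() helper is just a name made up here for illustration. Lower $max_concurrent further if the 503s persist.

<?php

// Sketch: throttled fetching with plain curl_multi, no ParallelCurl.
$urls = array(
    'http://www.codechef.com/status/CLETAB,tacoder',
    'http://www.codechef.com/status/CRAWA,tacoder',
    // ...and the remaining URLs from the script above
);

$max_concurrent = 5;          // same cap as $max_requests above
$multi  = curl_multi_init();
$queue  = $urls;
$active = 0;

// Create an easy handle for $url and attach it to the multi handle.
function start_handle($multi, $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($multi, $ch);
}

// Fill the initial window of at most $max_concurrent requests.
while ($active < $max_concurrent && $queue) {
    start_handle($multi, array_shift($queue));
    $active++;
}

do {
    curl_multi_exec($multi, $running);
    if ($running) {
        // Wait for activity on any handle instead of busy-looping.
        curl_multi_select($multi);
    }

    // Harvest finished transfers and refill the window from the queue.
    while ($info = curl_multi_info_read($multi)) {
        $ch = $info['handle'];
        echo curl_multi_getcontent($ch);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
        $active--;

        if ($queue) {
            start_handle($multi, array_shift($queue));
            $active++;
        }
    }
} while ($active > 0);

curl_multi_close($multi);

The key point in both versions is the window size: the server answers 503 because too many requests arrive at once, so the smaller the window (and, if necessary, the longer the pause between batches), the less likely you are to trigger it.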

