PHP Get HTTP Response Code

I have found a way to get the HTTP response code from a URL using get_headers($url). This function returns an array like the following...

Array
(
    [0] => HTTP/1.1 200 OK
    [1] => Date: Sat, 29 May 2004 12:28:13 GMT
    [2] => Server: Apache/1.3.27 (Unix)  (Red-Hat/Linux)
    [3] => Last-Modified: Wed, 08 Jan 2003 23:11:55 GMT
    [4] => ETag: "3f80f-1b6-3e1cb03b"
    [5] => Accept-Ranges: bytes
    [6] => Content-Length: 438
    [7] => Connection: close
    [8] => Content-Type: text/html
)
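
For a single URL, a minimal sketch of this approach (assuming only the numeric code from the first status line is needed) looks like this:

//minimal sketch: pull just the status code out of get_headers()
$url = 'http://www.example.com/';
$headers = @get_headers($url);   //returns false on failure
$status = 0;
if ($headers !== false && preg_match('{HTTP/\S+\s(\d{3})}', $headers[0], $m)) {
    $status = (int) $m[1];       //e.g. 200
}
echo $url . ' => ' . $status . PHP_EOL;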

My problem is that I may have a large list of URLs that I want to loop through and get the HTTP response code for each one. Calling this function inside a loop for potentially hundreds of URLs seems like an ugly and slow approach.

How can I speed up this process and make it cleaner, or is this already the best way to do it? I would love to hear your suggestions.

Thanks

The network calls inherently take time, but you can get to completion faster by running those calls in parallel. One way to do that is to use curl_multi. Give me a minute and I'll write out an example.

//set up list of urls and arrays to hold handles and responses
$urls = array(
    'http://www.livestrong.com/',
    'http://www.apple.com/'
    //add more urls here
);

$handles = array();          //url => curl handle, so a finished handle can be mapped back to its url
$responses_by_url = array();

//create the multi handle
$multi = curl_multi_init();
foreach ($urls as $url) {
    //add a request for each url
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HEADER, true);
    //since only the headers are needed, save bandwidth by making a HEAD request (no body is transferred)
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_multi_add_handle($multi, $ch);
    $handles[$url] = $ch;
}

//start the multi request
$still_running = 0;
curl_multi_exec($multi, $still_running);

//loop while waiting for completion
do {
    curl_multi_select($multi);               //blocks until there is activity
    curl_multi_exec($multi, $still_running); //process the new state

    //read all available completion messages
    while ($info = curl_multi_info_read($multi)) {
        if ($info['msg'] === CURLMSG_DONE) {
            $ch  = $info['handle'];
            $url = array_search($ch, $handles, true); //map the finished handle back to its url
            if ($info['result'] === CURLE_OK) {
                //request succeeded, keep the raw header lines
                $header_text = curl_multi_getcontent($ch);
                $responses_by_url[$url] = explode("\r\n", trim($header_text));
            } else {
                //record the error for this handle
                $responses_by_url[$url] = "error: " . curl_error($ch);
            }
            curl_multi_remove_handle($multi, $ch);
            curl_close($ch);
        }
    }
} while ($still_running);

//clean up
curl_multi_close($multi);

//output results
var_dump($responses_by_url);

You need to use curl_multi_init() to perform these 100 requests faster. There is a small PHP library, php-multi-curl, which can help you with the task.
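
If all you need is the numeric status code rather than the full header block, a minimal sketch along the same lines (reusing the $urls array from the answer above) can skip header parsing entirely and read the code with curl_getinfo():

//minimal sketch: collect only the numeric status code for each URL
$multi = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOBODY, true); //HEAD-style request, no body transferred
    curl_multi_add_handle($multi, $ch);
    $handles[$url] = $ch;
}

do {
    curl_multi_exec($multi, $running);
    curl_multi_select($multi); //wait for activity instead of busy-looping
} while ($running > 0);

$codes = array();
foreach ($handles as $url => $ch) {
    $codes[$url] = curl_getinfo($ch, CURLINFO_HTTP_CODE); //0 if no response was received
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);

var_dump($codes);

curl_getinfo() with CURLINFO_HTTP_CODE returns 0 when no response was received at all, so transport failures are easy to tell apart from HTTP error codes.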
