
How to get a webpage using PHP cURL in XAMPP

I have recently started working with PHP. In order to get some website content, I am using the following code, but it always returns "Empty". I am using XAMPP's PHP to run this code.

I have already uncommented the line extension=php_curl.dll in the xampp/php/php.ini file.

Why is it not returning the webpage content? Also, how can I get specific data from such web pages?

Here is my code:

<?php
$html=get_data('http://timesofindia.indiatimes.com/');
if (!empty($html))
    echo $html;
else
    echo "Empty";

function get_data($location){
    $ch = curl_init($location);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Connection: close'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
?>

How can I do this?
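
Before switching approaches, it may help to confirm that the cURL extension is actually loaded and to print cURL's own error message instead of a bare "Empty". The following is a debugging sketch of the get_data() function above, not the original code; the User-Agent header is an assumption (some sites reject requests that carry none), and the URL is the one from the question.

<?php
// Debugging sketch: verify the extension is loaded, then surface curl_exec()'s error.
if (!extension_loaded('curl')) {
    die('The cURL extension is not loaded - check php.ini and restart Apache.');
}

function get_data_debug($location) {
    $ch = curl_init($location);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    // Assumption: some sites refuse requests that have no User-Agent header.
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; PHP cURL)');

    $response = curl_exec($ch);
    if ($response === false) {
        // curl_error() explains why the request failed (DNS, timeout, SSL, ...).
        echo 'cURL error (' . curl_errno($ch) . '): ' . curl_error($ch);
    } else {
        echo 'HTTP status: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    }
    curl_close($ch);
    return $response;
}

$html = get_data_debug('http://timesofindia.indiatimes.com/');
?>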

You may use file_get_contents() instead of cURL, with or without a custom stream context:

$opts = array('http' =>
            array(
                'method'  => 'GET',
                'timeout' => 15
            )
        );
$context = stream_context_create($opts);

$url = 'http://timesofindia.indiatimes.com/';
$result = file_get_contents($url, false, $context);
echo ($result) ? $result : 'Empty';

Note that file_get_contents() can only read URLs when allow_url_fopen is enabled in php.ini.

http://php.net/file_get_contents
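
The question also asks how to pull specific data out of such pages. Below is a minimal sketch (not part of the original answer), assuming PHP's bundled DOM extension is available; the XPath query is purely illustrative and would need to be adapted to the site's real markup.

<?php
// Minimal sketch: extract specific data from the fetched HTML with DOMDocument.
// file_get_contents() is used here for brevity; the cURL result works the same way.
$html = file_get_contents('http://timesofindia.indiatimes.com/');
if ($html === false) {
    die('Could not fetch the page.');
}

libxml_use_internal_errors(true);   // real-world HTML is rarely well-formed
$doc = new DOMDocument();
$doc->loadHTML($html);
libxml_clear_errors();

$xpath = new DOMXPath($doc);

// Example: print the text of every link on the page.
foreach ($xpath->query('//a') as $link) {
    echo trim($link->textContent) . PHP_EOL;
}
?>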
