I'm using the Google PageRank checking script found here:
http://www.off-soft.net/en/develop/php/prcheck.html
I've noticed, however, that after too many requests the server temporarily bans me.
I'd like to route my requests through a list of proxy servers - can anyone get me started?
I'm looking for code examples of PHP requests that use a list of proxies.
Thanks!
PHP's cURL extension lets you use both SOCKS5 and HTTP proxies. Verify a list of proxy servers with a tool like YAPH before relying on it.
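As a minimal sketch of that, here is a helper that builds a cURL handle routed through either proxy type via `CURLOPT_PROXY` and `CURLOPT_PROXYTYPE`. The proxy address passed in the usage line is a placeholder, not a real server:

```php
<?php
// Build a cURL handle whose traffic goes through a proxy.
// $proxy is a "host:port" string; $type selects HTTP or SOCKS5.
function proxiedHandle(string $url, string $proxy, string $type = 'http')
{
    $types = [
        'http'   => CURLPROXY_HTTP,
        'socks5' => CURLPROXY_SOCKS5,
    ];
    if (!isset($types[$type])) {
        throw new InvalidArgumentException("unsupported proxy type: $type");
    }
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_PROXY          => $proxy,          // e.g. "127.0.0.1:1080"
        CURLOPT_PROXYTYPE      => $types[$type],   // CURLPROXY_HTTP or CURLPROXY_SOCKS5
        CURLOPT_RETURNTRANSFER => true,            // return the body instead of printing it
        CURLOPT_CONNECTTIMEOUT => 5,               // seconds before giving up on connect
    ]);
    return $ch;
}

// Usage (placeholder address; execute with curl_exec() against a real proxy):
$ch = proxiedHandle('http://example.com/', '127.0.0.1:1080', 'socks5');
```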
The temporary ban exists to prevent abuse, and using proxies to get around it isn't exactly a nice thing to do, so you're unlikely to find anyone here who'll help you violate that site's TOS.
That said, an HTTP proxy is just a web server that processes requests for outside URLs on your behalf and returns the results. The rest is left as an exercise for the asker.
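To sketch the "exercise" part as it relates to the question - cycling requests through a list - here is a hedged example of round-robin rotation over a hypothetical proxy list. The addresses are placeholders, and the rotation logic is a generic pattern, not anything specific to the PageRank script:

```php
<?php
// Round-robin rotation over a list of "host:port" proxy strings.
class ProxyRotator
{
    private array $proxies;
    private int $i = 0;

    public function __construct(array $proxies)
    {
        if (empty($proxies)) {
            throw new InvalidArgumentException('empty proxy list');
        }
        $this->proxies = array_values($proxies);
    }

    // Return the next proxy, wrapping back to the start of the list.
    public function next(): string
    {
        $p = $this->proxies[$this->i];
        $this->i = ($this->i + 1) % count($this->proxies);
        return $p;
    }
}

// Each request picks the next proxy in the list (placeholder addresses):
$rotator = new ProxyRotator(['10.0.0.1:3128', '10.0.0.2:3128']);
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_PROXY, $rotator->next());
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// $html = curl_exec($ch);  // one request per proxy, round-robin
```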
If you don't mind using a paid API, gimmeproxy.com will work for you.
<?php
function getProxy() {
    $data = json_decode(file_get_contents('http://gimmeproxy.com/api/getProxy?api_key=YOUR_API_KEY'), true);
    if (isset($data['error'])) { // no proxies left for this user-id and timeout
        echo $data['error'] . "\n";
        return false;
    }
    // gimmeproxy returns a 'curl' field that is a CURLOPT_PROXY-ready string
    return $data['curl'];
}

function get($url) {
    $curlOptions = array(
        CURLOPT_CONNECTTIMEOUT => 5,     // connection timeout, seconds
        CURLOPT_TIMEOUT        => 10,    // total time allowed for the request, seconds
        CURLOPT_URL            => $url,
        CURLOPT_SSL_VERIFYPEER => false, // don't verify SSL certificates, allows https scraping
        CURLOPT_SSL_VERIFYHOST => false, // don't verify SSL host, allows https scraping
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects
        CURLOPT_MAXREDIRS      => 9,     // max number of redirects
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER         => 0,
        CURLOPT_USERAGENT      => "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36",
        CURLINFO_HEADER_OUT    => true,
    );
    $curl = curl_init();
    curl_setopt_array($curl, $curlOptions);
    if ($proxy = getProxy()) {
        echo 'set proxy ' . $proxy . "\n";
        curl_setopt($curl, CURLOPT_PROXY, $proxy);
    }
    $data = curl_exec($curl);
    curl_close($curl);
    return $data;
}

while (true) {
    $data = get('https://news.ycombinator.com/');
    if (trim($data) && stripos($data, 'Hacker News') !== false) {
        echo "hacker news works fine\n";
        break;
    } else {
        echo "hacker news banned us, try another proxy\n";
    }
}
Sample PHP cURL request using a Squid proxy:
$proxy = "1.1.1.1:12121";
$useragent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1";
$url = "http://www.google.pt/search?q=anonymous";

$ch = curl_init();
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 15);
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1); // pass the constant, not a string
curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'USER:PASS');
curl_setopt($ch, CURLOPT_USERAGENT, $useragent);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
$result = curl_exec($ch);
curl_close($ch);
Learn how to implement your own Squid proxy with rotating outgoing IPs here