
How can I mimic Facebook's behavior when clicking an external link

Basically, since I offer my users the ability to share links within their comments and posts on my site, what I want is for my site to open the page "within" my page, so to speak, when a user clicks an external link. Kind of like how Facebook does it: you see the site in its entirety, yet Facebook's little navigation bar remains at the top of the site you just opened.

I want to copy this behavior so I can moderate links shared by users, flagging them if they are invalid or malicious so I can turn them off. Right now I am already catching the links and storing them on a per-user, per-link basis so I can moderate as needed. But for my users to flag a site, they currently have to go back to mine and follow a tedious process. What I want to do is offer a mini navigation bar that has a flag option on it, if and when a user so desires, plus a direct link back to my site.

So I am trying to figure out the best way. Should I pull the entire contents of a page via something like cURL, or should I put it in a frame-like setting? What is the best way to do it that is, in a manner of speaking, cross-platform and cross-browser friendly for both desktop and mobile browsers? I can foresee someone messing me up maliciously if I do something like cURL: all they have to do is dump some vile code somewhere, and since my site is picking it up and pulling it through a script, it might somehow break my site. I don't use cURL often enough to know if there is any major risk.

So what say you, Stack? A cURL method of some sort, frames, something else? Does anyone have a good example they can point me to?

If you use frames, some websites can jump out of them (frame-busting). If you use cURL, you need to parse all URLs (links, images, scripts, CSS) and rewrite them to point back to your own site if you want to keep the user within it. So cURL seems more reliable, but it requires a lot of work and it generates more bandwidth on your site. If you want a cURL-based solution, you can look for web proxy examples on the net.
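
For comparison, a bare-bones sketch of the frame route could look like the wrapper page below: your own bar stays on top and the external link loads in an iframe underneath. The flag.php endpoint and the styling are placeholders of mine, not part of the code further down. Bear in mind that sites sending X-Frame-Options: DENY or running frame-busting scripts will refuse to load or will break out of the iframe.

<?php
// Frame-based wrapper (sketch): keep your own bar on top and load the
// external link in an iframe below it. "flag.php" is a placeholder for
// whatever flagging endpoint you build.
$url     = isset($_GET['url']) ? $_GET['url'] : 'http://amazon.co.uk/';
$safeUrl = htmlspecialchars($url, ENT_QUOTES);
?>
<!DOCTYPE html>
<html>
<head>
<style>
    html, body { margin: 0; height: 100%; }
    #bar  { height: 40px; background: #3b5998; color: #fff; padding: 5px; }
    #site { width: 100%; height: calc(100% - 50px); border: 0; }
</style>
</head>
<body>
    <div id="bar">
        <a href="/">Back to my site</a>
        <a href="/flag.php?url=<?php echo urlencode($url); ?>">Flag this link</a>
    </div>
    <iframe id="site" src="<?php echo $safeUrl; ?>"></iframe>
</body>
</html>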

Here's some basic working code for the cURL-based proxy to get you started:

<?php
// Proxy script: fetch the requested page, absolutize root-relative URLs so
// assets still load, and route every link back through this script.
$url  = isset($_GET['url']) ? $_GET['url'] : 'http://amazon.co.uk/';
$html = file_get_contents2($url);

// Parse the fetched HTML (suppress warnings about malformed markup).
$doc = new DOMDocument();
@$doc->loadHTML($html);
$xml = simplexml_import_dom($doc);

// Base of the remote site (scheme + host) and base URL of this proxy.
$host  = parse_url($url, PHP_URL_SCHEME) . '://' . parse_url($url, PHP_URL_HOST);
$proxy = 'http://' . $_SERVER['SERVER_NAME'] . $_SERVER['SCRIPT_NAME'] . '?url=';

// Tags whose URL attribute needs rewriting.
$items = array(
    'a'      => 'href',
    'img'    => 'src',
    'link'   => 'href',
    'script' => 'src',
);

foreach ($items as $tag => $attr)
{
    $elems = $xml->xpath('//' . $tag);
    foreach ($elems as $e)
    {
        if (!isset($e[$attr]))
        {
            continue; // e.g. an inline <script> or a named <a> anchor
        }
        // Turn root-relative URLs ("/foo") into absolute ones.
        if (substr($e[$attr], 0, 1) == '/')
        {
            $e[$attr] = $host . $e[$attr];
        }
        // Route every link back through this proxy script.
        if ($tag == 'a')
        {
            $e[$attr] = $proxy . urlencode($e[$attr]);
        }
    }
}

// Serialize the modified DOM back to HTML and output it.
$xmls = $xml->asXml();
$doc->loadXML($xmls);
$html = $doc->saveHTML();
echo $html;


// Fetch a page with cURL, following redirects and sending a browser user agent.
function file_get_contents2($address)
{
    $useragent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1";

    $c = curl_init();
    curl_setopt($c, CURLOPT_URL, $address);
    curl_setopt($c, CURLOPT_USERAGENT, $useragent);
    curl_setopt($c, CURLOPT_HEADER, 0);         // body only, no response headers
    curl_setopt($c, CURLOPT_RETURNTRANSFER, 1); // return the result instead of echoing it
    curl_setopt($c, CURLOPT_FOLLOWLOCATION, 1); // follow redirects
    curl_setopt($c, CURLOPT_FRESH_CONNECT, 1);  // don't reuse cached connections

    $data = curl_exec($c);
    curl_close($c);

    if ($data === false)
    {
        return false;
    }

    return $data;
}
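
One more point, as an aside rather than part of the code above: lock down the url parameter before putting something like this on a live site, otherwise the script doubles as an open proxy for anyone on the internet. A rough sketch of such a guard, assuming you can look up the links you already store per user (the is_allowed_url() helper and $storedLinks are placeholders of mine):

// Hypothetical guard to run before calling file_get_contents2();
// $storedLinks stands in for the links you already capture per user.
function is_allowed_url($url, array $storedLinks)
{
    $parts = parse_url($url);
    if ($parts === false || !isset($parts['scheme'], $parts['host']))
    {
        return false;
    }
    // Only plain http/https targets, never file://, ftp:// and friends.
    if (!in_array(strtolower($parts['scheme']), array('http', 'https')))
    {
        return false;
    }
    // Only proxy links you already track for moderation.
    return in_array($url, $storedLinks, true);
}

// Example usage:
$storedLinks = array('http://amazon.co.uk/'); // would come from your database
if (!is_allowed_url($url, $storedLinks))
{
    header('HTTP/1.1 400 Bad Request');
    exit('Invalid link');
}

This also goes some way toward your worry about malicious pages, since only links you have already captured, and can flag, ever get proxied.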
