
How do I prevent a remote file inclusion attack in PHP?

This is my code in index.php

include ($_GET['page']);

Actually I need to include a page from the URL, like

"?page=go.php"

On the other hand, I cannot filter out

"?page=example.com"

because in some cases I need to include this value as well. But this is a remote file inclusion (RFI) vulnerability. How can I prevent RFI attacks on my site? I am doing something like

$filename = $_GET['page'];
if (file_exists($filename)) {
    include ($filename);
}

But this check only works for pages like

"?page=go.php"

and I am stuck with pages like

"?page=example.com"

If I understand the question correctly, you could set up an array of 'allowed' pages such as:

  $allowedPages = array('go.php', 'stop.php', 'file.php');
  $filename = $_GET['page'];

  if(in_array($filename, $allowedPages) && file_exists($filename)){
    include ($filename);
  }else{
    //output error
  }

I think this answer is too late, but for those who might search for this problem, I guess it can be done like this, too:

1. Define a constant with the full path.

2. Define a white-list of allowed pages.

3. Get the $_GET variable and convert it to lower case.

4. If the page given by the $_GET variable is in your white-list array, include it; otherwise, redirect the user to the home page and display an error message.

<?php
# this is abcd.php
define('SITE', 'http://www.yoursite.com/');
$allow = [SITE.'1.php', SITE.'2.php', SITE.'3.php'];
$get = strtolower(SITE.$_GET['page']);

if (in_array($get, $allow)) {
    include $get;
} else {
    header('Location: index.php?param=incorrect');
    exit; // stop execution after sending the redirect header
}
?>

<?php
# this is index.php
if(isset($_GET['param']) && $_GET['param'] == 'incorrect'){
?>
    <script type="text/javascript">
        alert("INCORRECT PARAMETER PROVIDED");
    </script>
<?php
} else die('ERROR');

I'm not 100% sure this will work without any problems, but I guess it's a good start, and you can play around with it and figure out what's suitable for you.

To be honest, your method of creating a dynamic website is definitely not the way to go.

To answer within the scope of this question, you'd do something like the following:

You'd have to set up a whitelist of files that are **ALLOWED** to be included through this function.

That could look something like this:

<?php 

$whitelist = array(
    'file1.php',
    'file2.php',
    'file3.php',
    'file4.php',
    'file5.php',
);

?>

Now, before including said file, you'd run a check with in_array():

<?php 

if(in_array($_GET['page'] . '.php', $whitelist) && file_exists($_GET['page'] . '.php')) {
    include($_GET['page'] . '.php');
}

?>



This, as you can see, is not very pretty! Another alternative would be doing something like:

<?php

$file = strtolower($_GET['page']) . '.php';
if (isset($whitelist[$file]) && file_exists($file)) {
    include($file);
}

?>
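Note that for the isset() check to find anything, $whitelist would have to use the filenames as array keys rather than plain values. A small sketch of that variant, reusing the same hypothetical file names as above:

<?php

// Whitelist keyed by filename, so the lookup is a cheap isset()
// instead of a linear in_array() search.
$whitelist = array(
    'file1.php' => true,
    'file2.php' => true,
    'file3.php' => true,
);

$file = strtolower($_GET['page']) . '.php';
if (isset($whitelist[$file]) && file_exists($file)) {
    include($file);
}

?>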

Don't accept a page.php parameter, but just the name of the file.

"?page=go"

Then check if $_REQUEST["page"] is alphanumeric

if (ctype_alnum($_REQUEST["page"]))

And just don't give non-alpha-numeric names to your files.

Then do a file_exists check on $_REQUEST["page"] and you should be quite good.

PS: $_REQUEST["page"] is about the same as $_GET["page"]; it just also merges in $_POST (and possibly cookie) values.
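Putting those steps together, a minimal sketch could look like this (the parameter handling and the fallback page home.php are assumptions for illustration, not part of the original answer):

<?php
// Accept only an alphanumeric page name, append the extension ourselves,
// and include the file only if it actually exists locally.
$page = isset($_REQUEST['page']) ? $_REQUEST['page'] : 'home';

if (ctype_alnum($page) && file_exists($page . '.php')) {
    include $page . '.php';
} else {
    include 'home.php'; // hypothetical fallback page
}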

You can filter out URLs that point to other domains instead of filtering file names. If the parameter contains a hostname of fewer than 3 characters, it is fine, as no domain name is that short.

$tmp = parse_url($_GET['page']);
// a purely local path has no host component at all
if (!isset($tmp['host']) || strlen($tmp['host']) < 3) {
    include($_GET['page']);
}

If there are some trusted hosts, then you can validate them too.
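A rough sketch of that extra validation, assuming a hard-coded list of trusted hosts (the host names below are placeholders); note that including remote URLs also requires allow_url_include to be enabled:

<?php
$trustedHosts = array('cdn.example.com', 'assets.example.com'); // placeholders

$tmp = parse_url($_GET['page']);

if (!isset($tmp['host'])) {
    // no host component, so it is a local path
    include($_GET['page']);
} elseif (in_array($tmp['host'], $trustedHosts, true)) {
    // remote file, but only from an explicitly trusted host
    include($_GET['page']);
} else {
    exit('Untrusted host');
}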
