
PDO and LOAD DATA LOCAL INFILE not working

I just tried to use LOAD DATA LOCAL INFILE with PDO. It didn't work out for me. Here's my function:

function connect($db_name,$db_host,$db_user,$db_pass)
{
    try
    {
        $this->connect = new PDO("mysql:host=$db_host;dbname=$db_name", $db_user, $db_pass);    
        $this->connect->exec("LOAD DATA LOCAL INFILE 'http://localhost/testoo.csv'
                                INTO TABLE 'parsed'
                                FIELDS TERMINATED BY ','
                                OPTIONALLY ENCLOSED BY '\"'
                                LINES TERMINATED BY '\n'
                                ('name','link','price','brand','imageurl')");

    }
    catch(PDOException $e) 
    {  
        echo $e->getMessage(); 
    }
}

For now, nothing happens. The same query works with a normal mysql_query. Any pointers on this issue?

Set the PDO::MYSQL_ATTR_LOCAL_INFILE attribute in the PDO connection options:

function connect($db_name, $db_host, $db_user, $db_pass)
{
    try
    {
        $this->connect = new PDO("mysql:host=$db_host;dbname=$db_name", $db_user, $db_pass,
                                 array(PDO::MYSQL_ATTR_LOCAL_INFILE => true));
        // Note: the file must be a local filesystem path readable by the client
        // (a URL will not work; the path below is a placeholder), and MySQL
        // identifiers take backticks or no quoting, not string quotes.
        $this->connect->exec("LOAD DATA LOCAL INFILE '/path/to/testoo.csv'
                                INTO TABLE `parsed`
                                FIELDS TERMINATED BY ','
                                OPTIONALLY ENCLOSED BY '\"'
                                LINES TERMINATED BY '\n'
                                (name, link, price, brand, imageurl)");
    }
    catch(PDOException $e)
    {
        echo $e->getMessage();
    }
}

I had the same problem. My MySQL server had the right local-infile configuration, and PHP/PDO had the right PDO::MYSQL_ATTR_LOCAL_INFILE setting too. The solution was to (re)install php5-mysqlnd:

$> apt-get update
$> apt-get install php5-mysqlnd

...and it worked :)
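Worth adding (a sketch, not from the original answers): the server side must also permit LOCAL loads. If the client flag is set and it still fails, check the server's `local_infile` variable; the connection details below are placeholders for your own settings.

```php
<?php
// Sketch: check whether the MySQL server itself allows LOAD DATA LOCAL INFILE.
// Host, database, and credentials are placeholders.
$dbh = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass',
               array(PDO::MYSQL_ATTR_LOCAL_INFILE => true));

$row = $dbh->query("SHOW GLOBAL VARIABLES LIKE 'local_infile'")
           ->fetch(PDO::FETCH_ASSOC);

if ($row && $row['Value'] !== 'ON') {
    // Requires SUPER (or SYSTEM_VARIABLES_ADMIN on MySQL 8); alternatively,
    // set local_infile=1 in my.cnf and restart the server.
    $dbh->exec('SET GLOBAL local_infile = 1');
}
```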

2019 Edit

7-years-later Sammitch here to say that my original answer is gross. I can't even figure out what the heck I was talking about with "fgetcsv() resource usage issues". It's possible that 7 years ago PHP was lacking some of the IO stream optimizations that it has today, but I'm willing to bet that it was resource constraints unrelated to PHP.

Jay Dhameliya's answer below is most likely the way you want to go. LOAD DATA INFILE should blast the data directly into MySQL as fast as possible.

For the sake of completeness, assuming that there is something preventing the use of LOAD DATA INFILE [like the giant security hole recently uncovered] and you want to efficiently load data from a file, you'll likely want to leverage transactions to batch IO and index writes, e.g.:

$fname = 'myfile.csv';

if( ! $fh = fopen($fname, 'r') ) {
    throw new Exception("Could not open $fname for reading.");
}

$dbh = new PDO(...);
$dbh->beginTransaction();
$stmt = $dbh->prepare('INSERT INTO table VALUES (?,?,?,...)');
try {
    while( $params = fgetcsv($fh) ) {
        $stmt->execute($params);
    }
} catch( \PDOException $e ) {
    $dbh->rollBack();
    throw $e;
}
$dbh->commit();

Having everything batched into a single transaction is still part of the reason why LOAD DATA INFILE is so fast, and is likely a large part of @Benjamin's suggestion of using extended inserts.
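A hedged sketch of the extended-insert idea mentioned above: instead of one INSERT per row, build a single multi-row statement so MySQL parses fewer statements and makes fewer round trips. The table and column names here are illustrative placeholders, not from the original post.

```php
<?php
// Sketch of an extended (multi-row) INSERT builder for use with PDO.
// Table and column names are assumptions for illustration only.
function build_extended_insert(string $table, array $columns, array $rows): array
{
    // One "(?,?,...)" group per row, matching the column count.
    $placeholderRow = '(' . implode(',', array_fill(0, count($columns), '?')) . ')';
    $sql = sprintf(
        'INSERT INTO `%s` (%s) VALUES %s',
        $table,
        implode(',', $columns),
        implode(',', array_fill(0, count($rows), $placeholderRow))
    );
    // Flatten the row values into one flat list for PDOStatement::execute().
    $params = array_merge(...$rows);
    return [$sql, $params];
}

// Example: two rows, three columns each.
[$sql, $params] = build_extended_insert('parsed', ['name', 'link', 'price'], [
    ['widget', 'http://example.com/w', '9.99'],
    ['gadget', 'http://example.com/g', '4.50'],
]);
// $sql is now:
// INSERT INTO `parsed` (name,link,price) VALUES (?,?,?),(?,?,?)
```

Each batch then costs one statement parse and one round trip instead of N, which is a large part of why extended inserts help.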

Original Gross Answer

  1. LOAD DATA LOCAL INFILE forbidden in... PHP
  2. Make sure that both the mySQL and www users have access to the file in question.

Alternatively: Use fgetcsv() and create the inserts programmatically.

edit:

To avoid the resource usage issues with fgetcsv() [because it tries to read the whole file at once] you can create a loop similar to below to read/insert manageable chunks.

<?php
$fname = 'myfile.csv';
$chunksize = 50;

if( ! $fh = fopen($fname, 'r') ) {
    throw new Exception("Could not open $fname for reading.");
}

$i=0;
$buffer = array();
while(!feof($fh)) {
    $buffer[] = fgets($fh);
    $i++;
    if( ($i % $chunksize) == 0 ) {
        commit_buffer($buffer);
        //run inserts
        $buffer = array(); //blank out the buffer.
    }
}

//clean out remaining buffer entries.
if( count($buffer) ) { commit_buffer($buffer); }

function commit_buffer($buffer) {
    foreach( $buffer as $line ) {
        $fields = explode(',', $line);
        //create inserts
    }
    //run inserts
    // No need to blank $buffer here: it is passed by value, and the
    // caller resets its own copy after each chunk.
}

In this way only $chunksize lines are held in memory at any given time.

You'll likely need additional code to handle things like encapsulated strings containing commas and line breaks, but if you can't get LOAD DATA LOCAL INFILE working I don't see much other choice.
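One alternative worth noting: fgetcsv() itself already handles enclosed fields containing commas and embedded line breaks, so swapping the fgets()/explode() pair above for fgetcsv() sidesteps that extra code. A minimal sketch using a throwaway temp file (the file contents are illustrative only):

```php
<?php
// Sketch: fgetcsv() copes with quoted fields that contain commas and
// newlines, which a plain explode(',') on fgets() output would mangle.
$csv = "name,comment\nwidget,\"cheap, cheerful\nand sturdy\"\n";
$path = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($path, $csv);

$rows = [];
$fh = fopen($path, 'r');
// Explicit length/delimiter/enclosure/escape arguments for portability.
while (($fields = fgetcsv($fh, 0, ',', '"', '\\')) !== false) {
    $rows[] = $fields;
}
fclose($fh);
unlink($path);

// $rows[1][1] holds the full quoted field, commas and newline intact:
// "cheap, cheerful\nand sturdy"
```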
