I have a script that reads an XML file and inserts the data into a MySQL database. My issue is that it only inserts one record. I have 60,000 rows of data to insert, and I want the process to take less than an hour.
My script:
$db_link = mysql_connect('localhost', 'root', '');
$db = mysql_select_db('my_db');

// SIMPLEXML: the cleaned file is opened
$xml_source = 'cleanme.xml';
$xml = simplexml_load_file($xml_source);

// Read each <Property> tag in the XML file
foreach ($xml->Property as $prop) {
    echo 'Reference ' . $prop->Reference . '<br>';
    $ref_id = $prop->Reference;
    // Read the <Image> sub-tags
    foreach ($prop->Images->Image as $chk) {
        echo 'REF_ID ' . $ref_id . ' ' . 'ImageID ' . $chk->ImageID . '<br>';
        $sql_refid = $ref_id;
        $sql_link = $chk->ImageID;
        // Append the INSERT for this row
        $sql .= "INSERT INTO prop_ref (id, ref, link) VALUES (NULL, '{$sql_refid}', '{$sql_link}')";
    }
}
mysql_query($sql);
echo 'Complete';
Shard your data into chunks, either by a fixed number of records per chunk (I prefer this) or by dividing your data into n sets, then do a batch insert, e.g.
INSERT INTO `table_name` (id, ref, link)
VALUES (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
Update:
And for sharding, here is one implementation:
$shardSize = 500;
$sql = '';
foreach ($data as $k => $row) {
    if ($k % $shardSize == 0) {
        if ($k != 0) {
            mysql_query($sql);
        }
        $sql = 'INSERT INTO `dbTable` (id, ref, link) VALUES ';
    }
    $sql .= (($k % $shardSize == 0) ? '' : ', ') . "(NULL, '{$row['refid']}', '{$row['link']}')";
}
// Don't forget to flush the final (possibly partial) shard
if ($sql != '') {
    mysql_query($sql);
}
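The same chunking can be written with `array_chunk()`, which avoids the modulo bookkeeping and never drops the last partial shard. Here is a minimal sketch; the helper name `buildBatchInserts` is mine, and it only builds the SQL strings, so escaping the values and calling `mysql_query()` on each string is still up to you:

```php
<?php
// Build batched INSERT statements from an array of rows, $shardSize rows
// per statement. Returns the list of SQL strings ready to be executed.
// (Hypothetical helper; values are assumed to be escaped already.)
function buildBatchInserts(array $rows, $shardSize = 500) {
    $queries = array();
    foreach (array_chunk($rows, $shardSize) as $chunk) {
        $values = array();
        foreach ($chunk as $row) {
            $values[] = "(NULL, '{$row['refid']}', '{$row['link']}')";
        }
        $queries[] = 'INSERT INTO `prop_ref` (id, ref, link) VALUES '
                   . implode(', ', $values);
    }
    return $queries;
}

// Example: 1,250 rows with a shard size of 500 yield 3 statements
// (500 + 500 + 250 rows).
$rows = array();
for ($i = 0; $i < 1250; $i++) {
    $rows[] = array('refid' => "R$i", 'link' => "IMG$i");
}
$queries = buildBatchInserts($rows, 500);
echo count($queries), "\n"; // 3
```

Each returned string is a single multi-row INSERT, so executing them means 3 round trips to MySQL instead of 1,250 — which is where the speedup comes from.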