Bulk data insert PHP MySQL speed
I have a script that reads an XML file and inserts the data into a MySQL database. My problem is that it inserts only one record at a time, and I have 60,000 rows of data to insert; I would like it to be faster than the hour it currently takes to insert the rows.
My script:
$db_link = mysql_connect('localhost', 'root', '');
$db = mysql_select_db('my_db');
// SIMPLEXML: open the cleaned file
$xml_source = 'cleanme.xml';
$xml = simplexml_load_file($xml_source);
// Read each tag in the XML file
foreach ($xml->Property as $prop) {
    echo 'Reference ' . $prop->Reference . '<br>';
    $ref_id = $prop->Reference;
    // Read the sub-tags in the XML file
    foreach ($prop->Images->Image as $chk) {
        echo 'REF_ID ' . $ref_id . ' ' . 'ImageID ' . $chk->ImageID . '<br>';
        $sql_refid = $ref_id;
        $sql_link = $chk->ImageID;
        // Build the insert statement
        $sql .= "INSERT INTO prop_ref (id, ref, link) VALUES (NULL, '{$sql_refid}','{$sql_link}')";
    }
}
mysql_query($sql);
echo 'Complete';
Split the data into chunks, either by a fixed number of records per chunk (which I would prefer) or into n groups, and then do a bulk insert for each chunk, e.g.:
INSERT INTO `table_name` (id, ref, link)
VALUES (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
, (NULL, '{$sql_refid}', '{$sql_link}')
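Building that multi-row VALUES list can be done mechanically. A minimal sketch, assuming the rows have already been collected into an array of `refid`/`link` pairs (the `buildChunkedInserts` helper and its default chunk size are illustrative, not part of the original script):

```php
<?php
// Build one multi-row INSERT statement per chunk of rows.
// Returns an array of SQL strings, one per chunk.
function buildChunkedInserts(array $rows, int $chunkSize = 500): array
{
    $statements = [];
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $values = [];
        foreach ($chunk as $row) {
            // NOTE: values are interpolated unescaped here, as in the
            // question; real code must escape or parameterise them.
            $values[] = "(NULL, '{$row['refid']}', '{$row['link']}')";
        }
        $statements[] = 'INSERT INTO `prop_ref` (id, ref, link) VALUES '
                      . implode(', ', $values);
    }
    return $statements;
}
```

Each returned statement can then be sent with a single query call, so 60,000 rows become roughly 120 round trips at 500 rows per chunk instead of 60,000.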
Update:
For chunking, here is one implementation:
$shardSize = 500;
$sql = '';
foreach ($data as $k => $row) {
    if ($k % $shardSize == 0) {
        if ($k != 0) {
            mysql_query($sql);
        }
        $sql = 'INSERT INTO `dbTable` (id, ref, link) VALUES ';
    }
    $sql .= (($k % $shardSize == 0) ? '' : ', ') . "(NULL, '{$row['refid']}', '{$row['link']}')";
}
// Flush the last (possibly partial) chunk, which the loop never sends.
if ($sql != '') {
    mysql_query($sql);
}
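An alternative to hand-built multi-row strings is a single prepared statement executed for every row inside one transaction, which removes the per-row commit overhead and escapes the values for you. A sketch using PDO (shown against an in-memory SQLite database purely so it is self-contained; with MySQL you would use a `mysql:` DSN and the same calls):

```php
<?php
// Open an in-memory SQLite database for the demo (assumption: PDO SQLite
// is available; with MySQL, use e.g. new PDO('mysql:host=...;dbname=...')).
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE prop_ref (id INTEGER PRIMARY KEY, ref TEXT, link TEXT)');

// Sample rows standing in for the data parsed out of the XML file.
$rows = [
    ['refid' => 'A1', 'link' => 'img1'],
    ['refid' => 'A1', 'link' => 'img2'],
];

// One transaction around all inserts: the statement is parsed once,
// and there is a single commit instead of one per row.
$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO prop_ref (ref, link) VALUES (?, ?)');
foreach ($rows as $row) {
    $stmt->execute([$row['refid'], $row['link']]);
}
$pdo->commit();

$count = (int) $pdo->query('SELECT COUNT(*) FROM prop_ref')->fetchColumn();
```

With 60,000 rows this typically cuts the runtime from per-row autocommits to one large commit, and it avoids the SQL-injection risk of interpolating the XML values directly into the query string.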