This is my array:
array(4) {
  [0]=>
  array(500000) {
    ["1234"]=>
    array(3) {
      ["fileName"]=>
      string(10) "monkey.jpg"
      ["path"]=>
      string(20) "animales/monkey.jpg"
      ["dateTime"]=>
      string(19) "2016-10-12 19:46:25"
    }
    ["3456"]=>
    array(3) {
      ["fileName"]=>
      string(9) "horse.jpg"
      ["path"]=>
      string(19) "animales/horse.jpg"
      ["dateTime"]=>
      string(19) "2016-10-12 19:46:25"
    }
    ... and many more ...
  }
  ... and many more ...
}
I want to store the content into my database:
$sql = "INSERT INTO files (id,fileName,path,dateTime) values(?,?,?,?)";
foreach ($array as $key => $value) {
    if (is_array($value)) {
        foreach ($value as $key => $v) {
            foreach ($v as $k => $item) {
                if (is_array($v)) {
                    $s = str_replace("\\", "", $v['dateTime']);
                    $d = strtotime($s);
                    $dateTime = date('Y.m.d H:i:s', $d);
                    $q->execute(array($key, $v['fileName'], $v['path'], $dateTime));
                }
            }
        }
    }
}
My problem is that I have over 500,000 entries, so my system crashes. I think it is because of the many nested loops. Is there a way to read the content with only one loop, or some other, faster way?
Note: $array is a spliced array created like this: $array[] = array_splice($original_array, 0, count($original_array));. I did that to try to make the system faster.
Please have a look at this answer:
MYSQL import data from csv using LOAD DATA INFILE
You should convert your data to CSV and rely on LOAD DATA INFILE.
Note that you need to upload the CSV file to your MySQL server to use this MySQL functionality.
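A minimal sketch of that approach, split into two steps. The first flattens the nested array into a CSV file with fputcsv; the second issues a single LOAD DATA LOCAL INFILE statement. It assumes a PDO connection to MySQL, that local_infile is enabled on both client and server, and that the files table matches the columns from the question; function names here are hypothetical:

```php
<?php
// Step 1: flatten the nested array into CSV, one row per file entry.
// Returns the number of rows written.
function writeFilesCsv(array $array, string $csvPath): int
{
    $fh = fopen($csvPath, 'w');
    $rows = 0;
    foreach ($array as $chunk) {              // outer level: the spliced chunks
        foreach ($chunk as $id => $file) {    // inner level: id => file details
            fputcsv($fh, [
                $id,
                $file['fileName'],
                $file['path'],
                date('Y-m-d H:i:s', strtotime($file['dateTime'])),
            ]);
            $rows++;
        }
    }
    fclose($fh);
    return $rows;
}

// Step 2: bulk-load the CSV in one statement instead of 500,000 INSERTs.
function loadFilesCsv(PDO $pdo, string $csvPath): void
{
    $pdo->exec(
        "LOAD DATA LOCAL INFILE " . $pdo->quote($csvPath) . "
         INTO TABLE files
         FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
         LINES TERMINATED BY '\\n'
         (id, fileName, path, dateTime)"
    );
}
```

Bulk loading shifts the parsing work from PHP to the MySQL server, which is typically much faster than executing one prepared INSERT per row.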
Using serialize or json_encode is probably the way to go. That way you won't have to traverse all the elements and handle slashes, because everything becomes a single string that can later be read back using the json_decode or unserialize PHP functions.
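As a small illustration of that round trip (the data here is just the sample entry from the question, and the resulting string would go into a single TEXT/JSON column rather than one row per file):

```php
<?php
// Encode the whole structure as one JSON string; no per-element
// escaping or slash handling is needed.
$files = [
    '1234' => [
        'fileName' => 'monkey.jpg',
        'path'     => 'animales/monkey.jpg',
        'dateTime' => '2016-10-12 19:46:25',
    ],
];
$blob = json_encode($files);           // store this string in the database
$restored = json_decode($blob, true);  // true => get associative arrays back
```

The trade-off is that the data is no longer queryable column by column, so this fits best when the array is only ever read back as a whole.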
Side note: please use meaningful variable names, so that the people helping you don't have to figure out what you meant. For example:
foreach($v as $k => $item){
is a bit worse than
foreach($fileCollection as $fileIterator => $fileDetails){
If you REALLY need to traverse all the data and store each file property in a separate column, all you need is two foreach loops (one for the collection, and one for each file):
foreach ($globalCollection as $fiveHundredThousandRows) {
    foreach ($fiveHundredThousandRows as $fileIterator => $fileData) {
        $timestamp = strtotime($fileData['dateTime']);
        $q->execute(array($fileIterator, $fileData['fileName'], $fileData['path'], date('Y.m.d H:i:s', $timestamp)));
    }
}
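If the row-by-row INSERT has to stay, wrapping the loop in a single transaction and reusing one prepared statement usually cuts the runtime dramatically, because MySQL otherwise commits after every row. A sketch of that idea, assuming $pdo is a PDO connection and the files table from the question (the function name is hypothetical):

```php
<?php
// Hypothetical helper: one transaction + one prepared statement for all rows.
function insertFiles(PDO $pdo, array $globalCollection): void
{
    $stmt = $pdo->prepare(
        'INSERT INTO files (id, fileName, path, dateTime) VALUES (?, ?, ?, ?)'
    );
    $pdo->beginTransaction();
    foreach ($globalCollection as $chunk) {
        foreach ($chunk as $id => $fileData) {
            $stmt->execute([
                $id,
                $fileData['fileName'],
                $fileData['path'],
                date('Y.m.d H:i:s', strtotime($fileData['dateTime'])),
            ]);
        }
    }
    $pdo->commit();
}
```

Preparing the statement once and committing once means the per-row cost is just a parameter bind and execute, not a full parse-plus-commit cycle.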