How can I create an array from a huge amount of data?
I am selecting a lot of data from my MySQL database:
$sql = "SELECT * FROM data WHERE id = ?";
$q = $pdo->prepare($sql);
$q->execute([$id]);
$array = $q->fetchAll(PDO::FETCH_ASSOC);
var_dump($array);
I want to store this data in an array and then work with that array in a loop. My problem is that I have such an immense amount of data that the array takes forever to load and my system is overwhelmed. Is there a way to build an array from a huge amount of data with better performance?
The fetchAll() function loads the entire result set into the variable at once. Fetching one row per iteration will be considerably faster and uses far less memory:
$q->execute([$id]);
$i = 0;
while ($row = $q->fetch()) {
    // do something with $row
    $i++;
}
Use the fetch() function instead of fetching everything at once, like below:
while ($arr = $stmt->fetch()) {
    // do_other_stuff();
}
This will reduce the load on your system.
I am currently processing 12,000 rows and encoding them into JSON arrays, and in my opinion this works best for me.
// mysqli_query() does not support ? placeholders; use a prepared statement
$stmt = mysqli_prepare($dbconn, "SELECT * FROM data WHERE id = ?");
mysqli_stmt_bind_param($stmt, "i", $id);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);
$rows = array();
while ($r = mysqli_fetch_assoc($result)) {
    $rows[] = $r;
}
Then echo it out (or, in my case, echo json_encode($rows)).
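As a minimal sketch of that collect-and-encode pattern (using an in-memory SQLite database via PDO as a stand-in for the real MySQL connection, so the example is self-contained; the table and values are made up for illustration):

```php
<?php
// Sketch only: SQLite in-memory stands in for the real MySQL connection.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE data (id INTEGER, value TEXT)');
$pdo->exec("INSERT INTO data VALUES (1, 'a'), (1, 'b')");

$q = $pdo->prepare('SELECT * FROM data WHERE id = ?');
$q->execute([1]);

// Collect each row into an array, then emit the whole thing as JSON.
$rows = [];
while ($row = $q->fetch(PDO::FETCH_ASSOC)) {
    $rows[] = $row;
}

echo json_encode($rows);
```

Note that this still holds all rows in memory at once; it is fine at 12,000 rows but defeats the purpose of row-by-row fetching at much larger scales.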
The main thing to consider is: looping over 500k records each time is not advisable and will definitely affect speed and performance.
I suggest the following options:
PS: you can consider using a cron job to process the data and update the sort order if needed.
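One way such a batch job could avoid touching all 500k records at once is to page through the table in fixed-size chunks with LIMIT/OFFSET. A hedged sketch, again using an in-memory SQLite database as a stand-in (table and column names are placeholders, and the batch size of 3 is only for demonstration):

```php
<?php
// Sketch only: process rows in batches instead of loading everything at once.
// SQLite in-memory stands in for the real database; names are placeholders.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE data (id INTEGER PRIMARY KEY, value TEXT)');
for ($i = 1; $i <= 10; $i++) {
    $pdo->exec("INSERT INTO data (value) VALUES ('row$i')");
}

$batchSize = 3;
$offset = 0;
$processed = 0;
while (true) {
    $q = $pdo->prepare('SELECT * FROM data ORDER BY id LIMIT ? OFFSET ?');
    // Bind as integers: MySQL rejects string-typed LIMIT parameters.
    $q->bindValue(1, $batchSize, PDO::PARAM_INT);
    $q->bindValue(2, $offset, PDO::PARAM_INT);
    $q->execute();
    $batch = $q->fetchAll(PDO::FETCH_ASSOC);
    if (!$batch) {
        break; // no more rows
    }
    foreach ($batch as $row) {
        $processed++; // do the real per-row work here
    }
    $offset += $batchSize;
}

echo $processed; // prints 10
```

For very large tables, keyset pagination (`WHERE id > ?` on an indexed column) scales better than OFFSET, which forces the database to skip past all previous rows on every batch.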