
PHP - Exceeding allowed memory when processing large dataset

I have a dataset with 999,000 records.

I have a SELECT query and a while loop to fetch the data. I use array_push to append each row retrieved in the loop to an array.

I then want to process the records in that array in batches of 1,000.

My problem is that when I use array_push with this much data, I get the error:

Fatal error: Allowed memory size of 134217728 bytes exhausted

How can I optimize my code to resolve this problem?

My code is below:

 $sql = "select customer_id";
 $sql .= " from";
 $sql .= "  t_customer t1";
 $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
 $result = $conn->query($sql);
 $customerArray = array();
 while ($row = $result ->fetch(PDO::FETCH_ASSOC)) {
    array_push($customerArray , $row);
 }
 // Execute every 1000 record 
 foreach(array_chunk($customerArray , 1000) as $execCustomerArray ) { 
   // My code to execute for every records.
   // ....
 }

I'm unsure whether it would fix anything, but one thing I will say is that pushing all the records into an array like that is silly.

You're using fetch to retrieve the rows one by one and then adding them all to an array, so why on earth aren't you just using PDOStatement::fetchAll()?

Example:

 $sql = "select customer_id";
 $sql .= " from";
 $sql .= "  t_customer t1";
 $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
 $result = $conn->query($sql);
 $customerArray = $result->fetchAll(PDO::FETCH_ASSOC);
 // Execute every 1000 record 
 foreach(array_chunk($customerArray , 1000) as $execCustomerArray ) { 
   // My code to execute for every records.
   // ....
 }

This may not fix your memory issue, because we can't see what the heavy lifting is for every customer record, but I will say that the while loop you had was silly, although most likely not the cause of your memory issue.

Depending on whether this is a script or a web page, you could also use an incremental loop and the MySQL LIMIT clause to implement basic paging for your data, thus preventing it all from coming into memory at once; a rough sketch of that approach is below.
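
For illustration only, a minimal sketch of that LIMIT-based paging, assuming $conn is the same PDO connection as above and that customer_id gives a stable ordering; the batch size and the processBatch() helper are hypothetical placeholders, not part of the original code:

 $batchSize = 1000;
 $offset = 0;
 do {
     $stmt = $conn->prepare(
         "select customer_id"
         . " from t_customer t1"
         . " inner join t_mail_address t2 using(mid, customer_id)"
         . " order by customer_id"            // paging needs a stable order
         . " limit :lim offset :off"
     );
     $stmt->bindValue(':lim', $batchSize, PDO::PARAM_INT);
     $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
     $stmt->execute();
     $batch = $stmt->fetchAll(PDO::FETCH_ASSOC);
     if (count($batch) > 0) {
         processBatch($batch);                // hypothetical per-batch handler
     }
     $offset += $batchSize;
 } while (count($batch) === $batchSize);      // a short page means we're done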

Using PDOStatement::fetchAll() would take too much memory, so it is no good for my case.

My edited code is below:

 $sql = "select customer_id";
 $sql .= " from";
 $sql .= "  t_customer t1";
 $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
 $result = $conn->query($sql);
 $customerArray = array();
 while ($row = $result ->fetch(PDO::FETCH_ASSOC)) {
    array_push($customerArray , $row);

     // Execute every 1000 record 
    if (count($customerArray) == 1000) {
      // My code to execute for every records.
      // ...
      $customerArray = array();
    }
 }

But I don't know how to process the last 99 records of the 999,000 if the array $customerArray still has data left in it after the loop. I don't want to use "select count" because of the performance cost.
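
For illustration, one way to handle the leftover records is to flush whatever remains in the array once the while loop finishes; a minimal sketch of that idea, where processBatch() is a hypothetical stand-in for the per-batch work:

 while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
     $customerArray[] = $row;
     // Execute every 1000 records
     if (count($customerArray) == 1000) {
         processBatch($customerArray);   // hypothetical per-batch handler
         $customerArray = array();
     }
 }
 // Anything left in the array after the loop is the final batch of
 // fewer than 1000 records; process it the same way.
 if (count($customerArray) > 0) {
     processBatch($customerArray);
 }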
