
Reading in 738627 records via flat file gives memory exhaustion error

I am trying to read in a simple flat file with 738,627 records in it. A sample of the file looks like this:

#export_date  genre_id  application_id  is_primary
#primaryKey: genre_id  application_id
#dbTypes: BIGINT  INTEGER  INTEGER  BOOLEAN
#exportMode: FULL
1276678802857  6000  281731735  0
1276678802857  6000  281826146  1
1276678802857  6000  282537230  1
1276678802857  6000  282778557  0
1276678802857  7019  377859111  0
1276678802857  7019  377877124  0
1276678802857  7019  377911623  0
1276678802857  7019  377948259  0
1276678802857  7019  377962380  0
1276678802857  7019  378051684  0
#recordsWritten: 738627

My relevant PHP code looks like this:

 ini_set("memory_limit","40M"); 
$fp1 = fopen('genre_application','r');
if (!$fp) {echo 'ERROR: Unable to open file.'; exit;}

$loop = 0;
while (!feof($fp1)) {
  $loop++;
    $line = stream_get_line($fp1,128,$eoldelimiter); //use 2048 if very long lines
if ($line[0] === '#') continue;  //Skip lines that start with # 
    $field[$loop] = explode ($delimiter, $line);
list($export_date, $genre_id, $application_id, $is_primary ) = explode($delimiter, $line);

// does application_id exist? 
$application_id = mysql_real_escape_string($application_id); 
$query = "SELECT * FROM jos_mt_links WHERE link_id='$application_id';"; 
$res = mysql_query($query); 
if (mysql_num_rows($res) > 0) { 
 echo $application_id . "application id" . $link_id . "\n";
} else 
{
// no, application_id doesn't exist 
echo $loop . "\n";
}

} //close reading of genre_application file
fclose($fp1);

The last output on my screen is as follows, so it's not even getting through the first 100,000 records. Is there a way to prevent the script from running out of memory?

81509
81510
81511
81512
81513
81514
81515
81516

PHP Fatal error: Allowed memory size of 41943040 bytes exhausted (tried to allocate 14 bytes) in /var/www/vhosts/smartphonesoft.com/httpdocs/fred/xmlfeed/test/text_to_mysql.php on line 156

Fatal error: Allowed memory size of 41943040 bytes exhausted (tried to allocate 14 bytes) in /var/www/vhosts/smartphonesoft.com/httpdocs/fred/xmlfeed/test/text_to_mysql.php on line 156

You seem to be storing every line in an array initialized outside the script you quote:

 $field[$loop] = explode ($delimiter, $line);

Why? This is bound to grow with every loop, until the 40MB limit is hit.

I think it'd work if you removed that, or changed it to a mere $field = ... .
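For illustration, here is a minimal sketch of the read loop with that accumulating line dropped; it assumes $delimiter and $eoldelimiter are defined earlier in the full script, as they are in the question:

while (!feof($fp1)) {
    $loop++;
    $line = stream_get_line($fp1, 128, $eoldelimiter);
    if ($line === false || $line === '' || $line[0] === '#') continue; // skip header/blank lines

    // No $field[$loop] = ... here, so nothing is retained between iterations.
    list($export_date, $genre_id, $application_id, $is_primary) = explode($delimiter, $line);

    // ... per-record work (the SELECT against jos_mt_links) goes here ...
}

Each $line is overwritten on the next iteration, so memory use stays flat no matter how many records the file has.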

You're populating $field[$loop] for every line read from the file, so this array grows with every iteration of the loop. Do you actually need this growing array in memory?

You kind of answer your own question... You want to store 738,627 arrays of text in memory. Is there a reason you keep all the exploded lines in $field?
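If in doubt, a quick diagnostic (not part of the original post) is to print memory_get_usage() inside the loop and watch it climb as $field fills up:

// Diagnostic sketch: report memory use every 10,000 lines.
if ($loop % 10000 === 0) {
    echo $loop . ' lines read, ' . round(memory_get_usage() / 1048576, 1) . " MB in use\n";
}

With the $field[$loop] assignment removed, the reported figure should stay roughly constant instead of rising toward the 40MB limit.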

You can raise the amount of available memory like this: ini_set("memory_limit","64M"); ... just put that at the top of your PHP file. If it's still not enough memory, bump it up to 128 or 256 megs.

In your specific case, though, you're not using the available memory efficiently... see Pekka's answer.

Where you have...

$res = mysql_query($query);

Shouldn't you have...?

if (isset ($res))
{
    mysql_free_result ($res); // free the previous result set (mysql_*, to match mysql_query)
}

$res = mysql_query($query);
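For context, a sketch of how that free could sit inside the loop from the question; the is_resource() check is an addition of mine, since mysql_query() returns FALSE on failure rather than a result resource:

// Sketch only: freeing each result inside the existing while loop.
$res = mysql_query($query);
if ($res !== false && mysql_num_rows($res) > 0) {
    echo $application_id . " application id\n";
} else {
    echo $loop . "\n";
}
if (is_resource($res)) {
    mysql_free_result($res); // release the result set before the next iteration
}

Freeing each result helps a little, but the growing $field array described in the other answers is the main cause of the exhaustion here.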
