
PHP: Processing thousands of entries, script dies after certain amount

I'm calling an MLS service that responds with 4000+ records, and I need to process each and every one of them, as well as insert all of the meta data per listing.

I'm able to get to about 135 records (* 150 meta records each), and then the script apparently stops responding, or at least stops processing the rest of the data.

I've added the following to my .htaccess file:

php_value memory_limit 128M

But this doesn't seem to help at all. Do I need to process the data in chunks, or is there another way to ensure that the script actually runs to completion?
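For reference, a minimal sketch of the chunked approach, assuming a hypothetical fetchListings() call that returns the full result set and a hypothetical processListing() helper that does the per-record inserts:

<?php
// Hypothetical helpers: fetchListings() pulls all records from the MLS
// service; processListing() inserts one listing plus its meta data.
$listings = fetchListings();

// Work through the records in batches of 100 so memory use and progress
// are easier to track.
foreach (array_chunk($listings, 100) as $batch) {
    foreach ($batch as $listing) {
        processListing($listing);
    }
    // Log progress between batches so a stall is visible in the error log.
    error_log('Batch done, peak memory: ' . memory_get_peak_usage(true));
}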

You should probably enable display_errors and error_reporting to get a better idea of why the script stops processing.
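For example, at the top of the script (a sketch; the same settings can also go in .htaccess or php.ini):

<?php
// Surface whatever fatal error or warning is silently killing the script.
ini_set('display_errors', '1');
error_reporting(E_ALL);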

However, you should also make sure the time limit isn't being hit, by calling:

set_time_limit(0);

This will give you an unlimited time period. You can also just set it to something relatively high, like 600 (10 minutes).
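Note that every call to set_time_limit() restarts the timeout counter from zero, so another option is to reset a modest limit inside the processing loop. A sketch, again assuming a hypothetical processListing() helper:

<?php
foreach ($listings as $listing) {
    // Restart the execution clock for each record; the script only dies
    // if a single record takes longer than 30 seconds.
    set_time_limit(30);
    processListing($listing);
}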

It isn't the memory; it's most likely the script execution time.

Try adding this to your .htaccess, then restart Apache:

php_value max_execution_time 259200
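Keep in mind that php_value directives in .htaccess only take effect when PHP runs as an Apache module (mod_php); under CGI/FastCGI they are ignored. In that case the equivalent runtime call, assuming the host permits it, is:

<?php
ini_set('max_execution_time', '259200');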

