I am running a very large script that imports data from a CSV file into a database. The CSV file contains thousands of rows, which is why the script times out. The script is also being run on a shared server, if that helps.
I have tried changing max_execution_time to 60, calling set_time_limit with 0 (and 300), and finally using ignore_user_abort(true) to check whether the script keeps running after the user aborts. None of this has worked.
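For reference, here is a minimal sketch of how those overrides are usually combined at the top of an import script. The CSV path, table name, and PDO credentials are placeholders, not from the original post, and on shared hosting set_time_limit() is often disabled or overridden by a web-server-level timeout:

```php
<?php
// Lift PHP's own execution limit for this request.
// Equivalent to ini_set('max_execution_time', '0').
// Hosts can disable this, and Apache/nginx/FastCGI timeouts still apply.
set_time_limit(0);

// Keep running even if the browser disconnects.
ignore_user_abort(true);

// Hypothetical import loop: stream the CSV row by row with a prepared
// statement instead of loading the whole file into memory.
$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO items (name, price) VALUES (?, ?)');

$fh = fopen('import.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    $stmt->execute([$row[0], $row[1]]);
}
fclose($fh);
```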
Could somebody please advise?
Thank you
Execute the script from your SSH shell, like: php yourfile.php. The CLI version of PHP has no execution time limit by default, so you won't end up with these timeout problems.
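A minimal sketch of this, assuming the script is at /home/user/import.php (the path is a placeholder):

```shell
# Run the importer with the CLI binary; CLI PHP ships with
# max_execution_time = 0, so the web-server timeout never applies.
php /home/user/import.php

# If the SSH session might drop before the import finishes,
# detach the process with nohup and log its output instead:
nohup php /home/user/import.php > import.log 2>&1 &
```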