
PHP curl to self to avoid FastCGI timeout

So here is my dilemma. I need to pull several hundred API calls' worth of data, parse the responses one at a time, and log the matching data. The problem is that this takes a while, I'm on shared hosting, and my FastCGI busy timeout can't be raised (the host won't change it, presumably because it's shared hosting). So I'm completely stumped on how to get around this. I can't run it from the CLI because it's a user-facing tool: users submit a list of data, and that's what I match against. So once the input is received, the PHP needs to keep running on its own until it finishes (probably a couple of hours).

I've tried everything and nothing works. At this point, to try to trick the system, I have the script call itself instead of looping, but that doesn't seem to work either. I think that's my only option (unless someone has a better idea), and I'm trying to figure out how to make each call back to itself look like a fresh request in the eyes of FastCGI. Help!
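For context, a minimal sketch of the "curl back to self" pattern described above might look like the following. This is an assumption about the asker's setup, not their actual code: the URL, the `offset` parameter, and the batch bookkeeping are all hypothetical. The idea is that each request processes one batch under the timeout, then fires a new request at the same script with a deliberately tiny client-side timeout so the current request can end immediately.

```php
<?php
// Hypothetical sketch of chaining one batch per FastCGI request.
// Assumes the script is reachable at $selfUrl and reads ?offset= to
// know where to resume; both are made-up names for illustration.
function chain_next_batch(string $selfUrl, int $offset): void
{
    $ch = curl_init($selfUrl . '?offset=' . $offset);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT_MS     => 100,  // abandon the request almost at once
        CURLOPT_NOSIGNAL       => true, // needed for sub-second timeouts
    ]);
    curl_exec($ch); // expected to time out; the new request keeps going
    curl_close($ch);
}

// The receiving side would need to survive the caller disconnecting:
// ignore_user_abort(true) near the top of the script, before the work.
```

Note that even with this, each individual request still has to finish its batch within the FastCGI busy timeout; the chaining only resets the clock between batches.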

If you have access to exec, you can create another PHP script (or any other program) to do the actual work, then launch it with exec so it runs directly on the machine instead of through FastCGI. You'd then want the worker to record its progress as it goes (how far it's gotten, and when it's done), and provide a page that checks on the progress of a request :)
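A minimal sketch of that idea, assuming `exec()` is available on the host; the file names (`worker.php`, the progress file) and the JSON progress format are hypothetical choices, not anything prescribed by the answer:

```php
<?php
// launcher.php — respond to the user immediately, run the worker in
// the background so FastCGI's busy timeout never applies to it.
function launch_worker(string $inputFile, string $progressFile): void
{
    $cmd = sprintf(
        'nohup php %s %s %s > /dev/null 2>&1 &',
        escapeshellarg(__DIR__ . '/worker.php'), // hypothetical worker script
        escapeshellarg($inputFile),
        escapeshellarg($progressFile)
    );
    exec($cmd); // returns at once; the worker keeps running on the machine
}

// worker.php — invoked via the PHP CLI, so no FastCGI timeout applies.
// Writes a small JSON progress file after each item.
function run_worker(array $items, string $progressFile): void
{
    $total = count($items);
    foreach ($items as $i => $item) {
        // ... do one API call and log matching data here ...
        file_put_contents($progressFile, json_encode([
            'done'     => $i + 1,
            'total'    => $total,
            'finished' => ($i + 1 === $total),
        ]));
    }
}

// progress.php — the page the user polls to check on their request.
function read_progress(string $progressFile): array
{
    if (!is_file($progressFile)) {
        return ['done' => 0, 'total' => 0, 'finished' => false];
    }
    return json_decode(file_get_contents($progressFile), true);
}
```

On shared hosting, `exec()` is often disabled (check `disable_functions` in `php.ini`), so verify it's actually available before building on this.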

Note: This really isn't a great idea for a production solution, but it will work better than figuring out a recursive curl call :)

