cURL isn't running the script as a background process

I've been messing around with cURL all day, and can NOT get it to behave the way I've read other people have it working. No matter what I do, my entire website hangs until the request finishes processing. I've even tried running it like:

exec("curl -d $params $url -k &");

It STILL didn't run as a background process, even though the exact same command works fine when I run it in a shell. I've tried so many methods today and can't get this to run in the background. I'm frustrated enough that I'm probably missing something small, but I just can't find it. The code is simple (for now):

$url = "path_to_script.php"; // placeholder for the real endpoint
$params = "id=$id&etc=etc";
$command = "curl -d \"$params\" \"$url\" -k &";
exec($command); // still blocks here despite the trailing &

The script that's being called is this:

$id = $_POST['id'];
$etc = $_POST['etc'];
$temp = file_get_contents("remote file"); // placeholder URL; this fetch is the slow part
$fp = fopen('test.txt', 'w');
fwrite($fp, $temp);
fclose($fp);

It works correctly, except that the file_get_contents call takes 30+ seconds to complete, and the entire server is frozen until it's done. This is repeated about 70 times. Why won't it run as a background process?!

Ideally I want to use curl_multi to process them all at the same time, but I have to have it run in the background, and am just plain failing to do it properly.
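For reference, here's a rough sketch of the curl_multi version I'm aiming for. It's untested; the URL, the params, and the count of 70 are the same placeholders and numbers as above:

$url = "path_to_script.php";
$mh = curl_multi_init();
$handles = array();

foreach (range(1, 70) as $id) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, "id=$id&etc=etc");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // keep responses out of the output buffer
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // equivalent of curl -k
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers concurrently until every one has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);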

Michael Mior is right: you need to redirect the output somewhere, because exec() hangs until the command's output stream is closed. If you don't care where it goes, something like this should do (note the trailing & to background the job, and the quotes so the & inside $params doesn't split the shell command):

exec("curl -d \"$params\" \"$url\" > /dev/null 2>&1 &");
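For completeness, here's a sketch of the full caller with escapeshellarg() added (that part is my addition, not the original code). Escaping keeps the shell from mangling the query string, the redirect lets exec() return without waiting for output, and the trailing & backgrounds the job:

$url = "path_to_script.php"; // same placeholders as in the question
$params = "id=$id&etc=etc";
$command = "curl -k -d " . escapeshellarg($params) . " " . escapeshellarg($url) . " > /dev/null 2>&1 &";
exec($command); // returns immediately; curl keeps running in the background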
