
Shell script wget not working when used as a cron job

I have a function in a PHP web app that needs to be called periodically by a cron job. Originally I just did a simple wget to the URL to call the function and everything worked fine, but ever since we added user authentication I have had trouble getting it to work.

If I manually execute these commands, I can log in, get the cookie, and then access the correct URL:

site=http://some.site/login/in/here
cookie=$(wget --post-data 'username=testuser&password=testpassword' "$site" -q -S -O /dev/null 2>&1 | awk '/Set-Cookie/{print $2}' | awk 'NR==2{print}')
wget -O /dev/null --header="Cookie: $cookie" http://some.site/call/this/function

But when executed as a script, either manually or through cron, it doesn't work.

I am new to shell scripting; any help would be appreciated.

This is being run on Ubuntu Server 10.04.

In theory, the only difference between executing these commands manually and running them from a script would be timing.

Try inserting a sleep 5 or so before the last command. Maybe the HTTP server does some internal communication that takes a while. It's hard to say, because you didn't post the error you get.
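One way to test that theory, and to capture the missing error output, is to add the delay and log each step's result. A minimal sketch, reusing the poster's placeholder URLs and credentials; the log path is illustrative:

```shell
#!/bin/bash
# Hypothetical rework of the poster's script: pause between login and
# the function call, and log each step to a file so a failed cron run
# leaves something to inspect.
log=/tmp/cron-wget.log
site='http://some.site/login/in/here'

cookie=$(wget --post-data 'username=testuser&password=testpassword' \
              "$site" -q -S -O /dev/null 2>&1 \
         | awk '/Set-Cookie/ {print $2}' | awk 'NR==2 {print}')
echo "$(date '+%F %T') login step, cookie='$cookie'" >> "$log"

sleep 5   # give the server time to register the session

wget -O /dev/null --header="Cookie: $cookie" \
     http://some.site/call/this/function 2>> "$log"
echo "$(date '+%F %T') function call exited with status $?" >> "$log"
```

Even when both wget calls fail, the log records an empty cookie and a nonzero exit status, which narrows the problem down.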

OK, simple things first:

  1. I assume the file begins with #!/bin/bash or something similar
  2. You have chmod-ded the file +x
  3. You're using Unix line endings (0x0a, not the 0x0d of Windows CRLF endings)

And you're not expecting any of the variables to be returned to the calling shell, I presume?
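Those three checks can all be done from a shell. A quick sketch against a small stand-in script at a hypothetical path, /tmp/check_me.sh:

```shell
#!/bin/sh
# Hypothetical demo: create a small script, then verify the shebang,
# the execute bit, and the line endings (Unix = LF/0x0a, no CR/0x0d).
cat > /tmp/check_me.sh <<'EOF'
#!/bin/bash
echo ok
EOF
chmod +x /tmp/check_me.sh

head -n 1 /tmp/check_me.sh                     # 1. shebang present?
[ -x /tmp/check_me.sh ] && echo "executable"   # 2. +x set?

# 3. any CR (0x0d) bytes would mean Windows CRLF endings
cr=$(printf '\r')
if grep -q "$cr" /tmp/check_me.sh; then
    echo "CRLF endings found"
else
    echo "Unix line endings"
fi
```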

Failing that, try tee-ing the output of each command to a log file.
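Tee-ing the cookie pipeline might look like the sketch below. The header lines are faked with printf here so it runs without a live server, and the file paths are illustrative:

```shell
#!/bin/sh
# Hypothetical debugging version of the cookie extraction: tee saves the
# raw response headers (faked with printf in place of wget -S output) to
# a log file before awk filters them, so a cron failure can be inspected.
logdir=/tmp/wget-debug
mkdir -p "$logdir"

printf 'Set-Cookie: lang=en\nSet-Cookie: session=abc123\n' \
    | tee "$logdir/login-headers.log" \
    | awk '/Set-Cookie/ {print $2}' | awk 'NR==2 {print}' \
    > "$logdir/cookie.txt"

cat "$logdir/cookie.txt"   # -> session=abc123
```

After a cron run, login-headers.log shows exactly what the server sent, and cookie.txt shows what the awk filter made of it.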
