I have this line at the end of my bash script:
wget -q $URL -Y off
where URL is the path to a LIVE txt file that contains the console output of my bash script.
Clarification on the LIVE txt file:
This txt file keeps updating while the script runs, since it stores the console output. My shell script runs some "heavy" programs, so the console output is very lengthy and the txt file takes some time to finish generating (roughly a 3-5 second delay).
Problem: sometimes I get an incomplete txt file that contains only a small portion of the final file (it is expected to contain everything printed before the wget command runs). Interestingly, most of the incomplete files I downloaded stop at line 9999. I don't know whether this is some kind of limit, but the download appears to be fine whenever the console output is under 5k lines.
Question: Is there a way to make wget wait for a specific amount of time?
PS: I know there is a timeout option (-T) for wget, and the default is 900 seconds. But my problem is not a slow connection; it is the delay in generating the file.
If this wget statement is running as part of the script that is generating the output, you are probably running into an output-buffering issue: since the script has not yet exited, its output has not yet been flushed to the file.
How are you redirecting the output? One approach is to wrap everything other than the wget statement in a subshell and redirect the output of that subshell:
#!/bin/sh
(
    # ...all of the "heavy" commands whose output you want to capture...
) > output 2>&1
wget -q "$URL" ...
This ensures that the other commands have completed, and the subshell containing them has exited (flushing and closing the output file), by the time the wget runs.
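A minimal, runnable demonstration of the pattern (with `seq` standing in for the "heavy" programs, and a line count standing in for the final wget step):

```shell
#!/bin/sh
# All output-producing commands run inside ( ... ); the redirection is only
# closed, and the file fully flushed, once the subshell exits.
(
    echo "start"
    seq 1 10000      # stand-in for the lengthy console output
    echo "done"
) > output.log 2>&1

# At this point output.log is complete, so a follow-up step (the wget in the
# original script) would see the whole file. Here we just count its lines.
wc -l < output.log
```

The same effect can be had by putting the heavy commands in their own script and redirecting its invocation; the key point is that the writer finishes and its file descriptors are closed before the reader starts.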