I want to retrieve the contents of a log file stored on a server, at an address of the form:
http://[SOME SORT OF ADDRESS]/file.txt
and print it to a shell, refreshing with something like 'watch' so I can keep real-time track of the log from a shell rather than having to use a browser. Is there any neat and simple way of doing this (possibly later wrapped up into a Python script)?
This will fetch the requested URL every 0.1 seconds and display it on the console:
watch -n 0.1 wget -qO- http://google.com
Yes, tail -f
works on Linux. However, it only works on local files. If the file is remote, you will have to fetch it repeatedly to keep it updated (I think).
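That repeated-fetch idea can be sketched in Python using only the standard library. This is a minimal, hypothetical sketch (the `fetch`/`follow` names and the assumption that the log only grows by appending are mine, not from the answers here); it prints only the lines added since the last poll, roughly like `tail -f`:

```python
import time
import urllib.request


def fetch(url):
    """Return the full text of the log file at `url`."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")


def follow(url, interval=1.0):
    """Poll `url` forever, printing only newly appended lines."""
    seen = 0  # number of lines already printed
    while True:
        lines = fetch(url).splitlines()
        for line in lines[seen:]:
            print(line)
        seen = len(lines)
        time.sleep(interval)
```

Note that this re-downloads the whole file on every poll; for large logs you would want an HTTP Range request instead, but for a small log file this is the simplest thing that works.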
Perhaps you could try something like this:
watch "wget -N http://[SOME SORT OF ADDRESS]/file.txt &> /dev/null; cat file.txt"
while sleep 60; do
curl address
done
This will print the output once a minute.
Adjust the interval to your liking.