
Python Script On Webserver

I have a Python script that downloads N images from a website. I run it on an Ubuntu 10.04 web server. For example, download.py downloads 10,000 images from a website and logs any errors that occur to a file. After downloading N images it exits. On my local machine I usually run it like:

sudo python download.py

How can I run it on the web server so that it keeps running until it finishes, then stops? I run it manually when I need to (a cron job is not necessary). The for loop in the script:

for i in range(1, N):
   #do download

If the script is stopped by an error, I have to run it again from the beginning, since I do not save any data that would let me resume from where it stopped.
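One way to avoid restarting from the beginning is to checkpoint progress to a file after each successful download. This is a minimal sketch, not from the original post; `progress.txt` and the `download_image` callback are hypothetical names:

```python
import os

PROGRESS_FILE = "progress.txt"  # hypothetical checkpoint file

def load_start(path=PROGRESS_FILE):
    """Return the index to resume from (0 if no checkpoint exists)."""
    if os.path.exists(path):
        with open(path) as f:
            return int(f.read().strip() or 0)
    return 0

def save_progress(i, path=PROGRESS_FILE):
    """Record that all images before index i are done."""
    with open(path, "w") as f:
        f.write(str(i))

def run(n, download_image):
    # download_image(i) stands in for the actual download logic.
    for i in range(load_start(), n):
        download_image(i)
        save_progress(i + 1)  # checkpoint after each successful download
```

If the script crashes partway through, the next invocation reads `progress.txt` and continues from the first image that was not yet downloaded.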

On your server, you can use an event loop provided by Twisted and have it check at regular intervals (note that `start_download` and `num_images_downloaded` are placeholders for your own download logic):

from twisted.internet import task
from twisted.internet import reactor

timeout = 60.0  # seconds between checks

num_images_downloaded = 0  # updated by your download logic

def download_links():
    # Start a new batch only once the previous download is over.
    if num_images_downloaded < 1000:
        pass  # previous download still in progress
    else:
        start_download()  # placeholder: kick off the next batch

l = task.LoopingCall(download_links)
l.start(timeout)  # call every sixty seconds

reactor.run()

You can create a Python daemon service that runs the script.

Use a process manager such as Supervisor.
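With Supervisor, the script is declared as a program that you can start and stop on demand with `supervisorctl`. A minimal config sketch (the program name and paths are assumptions):

```ini
[program:download]
command=python /path/to/download.py
directory=/path/to
autostart=false          ; start manually with: supervisorctl start download
autorestart=unexpected   ; restart only if the script exits with an error
stdout_logfile=/var/log/download.log
redirect_stderr=true
```

Because the script exits normally when all N images are done, `autorestart=unexpected` lets it stop cleanly while still restarting it after a crash.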

In my case, I can run it like this: sudo nohup python download.py
