
Multiprocessing in Python

My question might be quite vague, so I am looking for a reasonable approach to the task. I have developed a webpage where a user uploads a file that serves as the input file for my code, which is written in Python.

When the input file is submitted through the webpage, it is saved in a temporary folder, from where a daemon copies it to another location. What I want is to look for files in that folder regularly (I can write a daemon for this). If it finds files there, it should run my code as separate jobs, one per input file, limited to a maximum of 5 processes running at the same time; when one process finishes, the next file in the folder (in chronological order) should be started.

I am aware of multiprocessing in Python, but I don't know how to implement it to achieve what I want, or whether I should go for something like XGrid to manage my jobs. The code usually takes a few hours to a few days to finish one job, but the jobs are independent of each other.
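For what it's worth, here is a minimal sketch of the multiprocessing side, assuming the uploaded files land in a single folder (the `WATCH_DIR` path, the `run_job` body, and the polling interval are all placeholders): a `Pool` capped at 5 workers, fed with the files sorted by modification time so jobs start in chronological order.

```python
import glob
import os
import time
from multiprocessing import Pool

WATCH_DIR = "/path/to/upload/folder"  # hypothetical: where the daemon drops the files
MAX_WORKERS = 5                       # at most 5 jobs running at the same time

def run_job(path):
    """Stand-in for the real hours-long computation on one input file."""
    print("processing", path)
    # ... do the actual work here ...
    os.rename(path, path + ".done")   # rename so the file is not picked up again

def main():
    submitted = set()
    with Pool(processes=MAX_WORKERS) as pool:
        while True:
            # oldest file first, so jobs start in chronological order
            for path in sorted(glob.glob(os.path.join(WATCH_DIR, "*")),
                               key=os.path.getmtime):
                if path not in submitted and not path.endswith(".done"):
                    submitted.add(path)
                    pool.apply_async(run_job, (path,))
            time.sleep(60)            # poll the folder once a minute

if __name__ == "__main__":
    main()
```

`Pool` queues everything submitted with `apply_async` and never runs more than `MAX_WORKERS` jobs at once, so the "start the next one when a slot frees up" behaviour comes for free.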

I use an SQL table for such a thing, because my users may launch dozens of tasks all at once if I do not limit them.

When a new file shows up, a daemon writes its name into the table (along with all sorts of other information such as size, date, time, user, ...).
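A minimal sketch of that first daemon, assuming an SQLite file `jobs.db` and a `jobs` table (both names are hypothetical, as are the columns):

```python
import os
import sqlite3
import time

DB = "jobs.db"                         # hypothetical queue database
WATCH_DIR = "/path/to/upload/folder"   # hypothetical upload location

def enqueue_daemon():
    conn = sqlite3.connect(DB)
    conn.execute("""CREATE TABLE IF NOT EXISTS jobs (
                        path     TEXT PRIMARY KEY,
                        size     INTEGER,
                        added    REAL,
                        started  REAL,
                        finished REAL,
                        done     INTEGER DEFAULT 0)""")
    while True:
        for name in os.listdir(WATCH_DIR):
            path = os.path.join(WATCH_DIR, name)
            # INSERT OR IGNORE: a file already in the table is skipped
            conn.execute(
                "INSERT OR IGNORE INTO jobs (path, size, added) VALUES (?, ?, ?)",
                (path, os.path.getsize(path), os.path.getmtime(path)))
        conn.commit()
        time.sleep(60)   # look for new files once a minute
```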

Another daemon then reads the table, picks the first unexecuted task, runs it, and marks it as executed. When there is nothing to do, it just waits for another minute or so.
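The second daemon could look like this (same hypothetical `jobs.db` schema as above; `run_job` stands in for the actual computation):

```python
import sqlite3
import time

DB = "jobs.db"   # same hypothetical database as above

def run_job(path):
    """Stand-in for the real computation on one input file."""
    ...

def worker_daemon():
    conn = sqlite3.connect(DB)
    while True:
        # oldest unexecuted task first, i.e. chronological order
        row = conn.execute(
            "SELECT path FROM jobs WHERE done = 0 ORDER BY added LIMIT 1"
        ).fetchone()
        if row is None:
            time.sleep(60)        # nothing to do; check again in a minute
            continue
        path = row[0]
        started = time.time()
        run_job(path)
        conn.execute(
            "UPDATE jobs SET done = 1, started = ?, finished = ? WHERE path = ?",
            (started, time.time(), path))
        conn.commit()
```

Running 5 copies of this worker (to match the 5-process limit from the question) would need a little more care, e.g. claiming the row with an UPDATE before running it so that two workers never grab the same job, but the structure stays the same.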

This table is also a log of the jobs performed and may carry results too, and you can compute averages from it.
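For example, with the `started`/`finished` columns from the sketch above (my addition, not from the original answer), an average runtime is one query away:

```python
import sqlite3

conn = sqlite3.connect("jobs.db")
avg = conn.execute(
    "SELECT AVG(finished - started) FROM jobs WHERE done = 1"
).fetchone()[0]
print("average job runtime:", avg, "seconds")  # None until a job has finished
```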
