A set of cluster python workers doing scheduled tasks
I would like to build a set of cluster workers (i.e. droplets on DigitalOcean or similar). Each worker would perform a periodic task and send its results periodically to the main application. Here is pseudocode to demonstrate the functionality:

Worker code
    import time

    while True:
        # compute this worker's periodic result
        resultFromLocationXY = calculate_my_local_task()
        # push it to the main application
        send_task_to_the_main_application(resultFromLocationXY)
        time.sleep(5)
Main application code

In the main application I would like to asynchronously evaluate the worker results:
    import time

    while True:
        # not blocking: if new results are available, update the results variable
        resultFromLocationXY = listen_to_results_from_location('xy')
        process_the_results([resultFromLocationXY, resultFromLocationXX, ...])
        time.sleep(5)
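Since the two loops above are pseudocode, here is a minimal runnable simulation of the pattern in one process, using threads and a queue.Queue in place of the network transport; every function and name below is illustrative, not from any particular library.

```python
import queue
import threading
import time

# Shared channel standing in for the network; in the real cluster this
# would be a broker, socket, or similar transport.
results = queue.Queue()

def calculate_my_local_task(location):
    # hypothetical stand-in for the real periodic computation
    return f"result-from-{location}"

def worker(location, iterations, period=0.01):
    # each "droplet" would run a loop like this
    for _ in range(iterations):
        results.put((location, calculate_my_local_task(location)))
        time.sleep(period)

latest = {}  # location -> most recent result received

def drain_results():
    # non-blocking read: take whatever is currently queued, never wait
    while True:
        try:
            location, value = results.get_nowait()
        except queue.Empty:
            return
        latest[location] = value

threads = [threading.Thread(target=worker, args=(loc, 3)) for loc in ("xy", "xx")]
for t in threads:
    t.start()
for t in threads:
    t.join()

drain_results()
print(sorted(latest))
```

The main side calls drain_results() once per cycle instead of blocking on the queue, which matches the "not blocking" requirement in the pseudocode.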
I have been using the ipython ipcluster solution. I was able to create a remote worker, execute a task using the apply_async function, and arrange it all in a non-blocking way.

BUT: I was not able to make periodic, streaming-style tasks. Moreover, I would like to have several nodes in one location, all streaming into the same variable in the main application.

I would prefer a non-ipython solution.
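For reference, the non-blocking apply_async pattern described above can be sketched with the stdlib multiprocessing.pool.ThreadPool, which exposes an AsyncResult-style handle similar to ipcluster's apply_async; the task function and locations are illustrative.

```python
from multiprocessing.pool import ThreadPool

def local_task(location):
    # hypothetical placeholder for the per-location computation
    return f"result-from-{location}"

with ThreadPool(processes=2) as pool:
    # apply_async returns immediately; the main loop is free to do other work
    pending = {loc: pool.apply_async(local_task, (loc,)) for loc in ("xy", "xx")}
    # later, collect whichever results are wanted (get() blocks up to timeout)
    collected = {loc: handle.get(timeout=10) for loc, handle in pending.items()}

print(collected)
```

This is fire-once rather than periodic, which is exactly the limitation described: the handle resolves a single result, so streaming requires re-submitting or a different transport.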
To me, one of the best solutions could be something like
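For example, a sketch of one possible non-ipython setup, assuming the stdlib multiprocessing.connection module as the transport (a message broker or ZeroMQ would be more typical in production): workers connect over TCP and push results, while the main side polls without blocking. All names and values here are illustrative.

```python
import threading
import time
from multiprocessing.connection import Listener, Client

AUTHKEY = b"secret"  # shared key; a real deployment would use a proper secret

def worker(address, location, iterations):
    # each remote worker would run a loop like this on its own droplet
    with Client(address, authkey=AUTHKEY) as conn:
        for _ in range(iterations):
            conn.send((location, f"result-from-{location}"))  # placeholder result
            time.sleep(0.01)

def main():
    latest = {}  # most recent result per location
    # port 0 lets the OS pick a free port; a real setup would use a fixed one
    with Listener(("localhost", 0), authkey=AUTHKEY) as listener:
        t = threading.Thread(target=worker, args=(listener.address, "xy", 3))
        t.start()
        with listener.accept() as conn:
            while True:
                # poll with a short timeout instead of blocking on recv
                if conn.poll(0.05):
                    try:
                        location, value = conn.recv()
                    except EOFError:  # worker closed its connection
                        break
                    latest[location] = value
                elif not t.is_alive():
                    break
    t.join()
    return latest

result = main()
print(result)
```

The worker runs here as a thread only to keep the sketch self-contained; on a cluster the same worker() function would run on each node, all pointed at the main application's address, and the main loop would poll one connection per node.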