
A set of cluster python workers doing scheduled tasks

I would like to build a set of cluster workers (e.g. droplets on DigitalOcean or similar).

Each worker would perform a periodic task and send the results to the main application.

Here is some pseudocode to demonstrate the functionality:

Worker code

import time

while True:
    # Compute the local result and push it to the main application.
    resultFromLocationXY = calculate_my_local_task()
    send_task_to_the_main_application(resultFromLocationXY)
    time.sleep(5)

Main application code

In the main application I would like to evaluate the worker results asynchronously:

import time

while True:
    # Non-blocking: if new results are available, update the variable.
    resultFromLocationXY = listen_to_results_from_location('xy')

    process_the_results([resultFromLocationXY, resultFromLocationXX, ...])
    time.sleep(5)
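The non-blocking `listen_to_results_from_location` call above could be approximated with a standard-library queue: a listener thread (not shown) pushes `(location, value)` pairs into it, and the main loop drains whatever is currently available with `get_nowait()` so it never stalls waiting for a worker. This is only a sketch; `results_queue`, `latest`, and `poll_results` are illustrative names, not part of any existing API.

```python
import queue

# Shared queue: worker listener threads would put (location, value) here.
results_queue = queue.Queue()
latest = {}  # last known result per location, e.g. {'xy': ...}

def poll_results():
    # Drain everything currently available without blocking.
    while True:
        try:
            location, value = results_queue.get_nowait()
        except queue.Empty:
            break
        latest[location] = value

# Demo: simulate two workers having reported.
results_queue.put(('xy', 1.5))
results_queue.put(('xx', 2.5))
poll_results()
print(latest)  # each location now holds its most recent value
```

If a worker reports twice between polls, the later value overwrites the earlier one, which matches the "update the results variable" semantics in the pseudocode.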

I have been using the ipython ipcluster solution. I was able to create a remote worker, execute a task with the apply_async function, and arrange it all in a non-blocking way.

BUT: I was not able to set up periodic, streaming-style tasks. Moreover, I would like to have several nodes in one location all streaming into the same variable in the main application.
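One way to turn a one-shot task into a periodic, streaming-style one is to wrap it in a loop that schedules against `time.monotonic()`, so the time the task itself takes does not accumulate as drift. A minimal sketch, where `calculate_my_local_task` is a stand-in stub and `run_periodically` a hypothetical helper:

```python
import time

def calculate_my_local_task(i):
    # Stub for the real per-node computation.
    return i * i

def run_periodically(task, interval, n_runs):
    results = []
    next_run = time.monotonic()
    for i in range(n_runs):
        results.append(task(i))
        next_run += interval
        # Sleep only for the time remaining until the next slot,
        # so the task's own duration does not drift the schedule.
        time.sleep(max(0.0, next_run - time.monotonic()))
    return results

print(run_periodically(calculate_my_local_task, 0.01, 3))  # [0, 1, 4]
```

In a real worker the loop would run indefinitely and ship each result upstream instead of collecting it locally.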

I would prefer a non-ipython solution.

To me, one of the best solutions could be something like:

  • Create a python daemon that listens for workers. Once a worker sends some data, the daemon spawns a thread to handle it and immediately goes back to listening for other workers.
  • Create the workers and use the cron utility. That way you can create workers on the fly at a chosen interval. Each worker calls the main daemon, and once the daemon responds with an "OK" status the worker can start.
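The first option above can be sketched with only the standard library: a daemon thread accepts one TCP connection per worker report, hands it to a handler thread, and immediately goes back to accepting. Everything here (the port choice, `handle_worker`, the two demo payloads) is illustrative; cron-started workers would each run the "worker" part once.

```python
import socket
import threading

HOST = '127.0.0.1'
received = []
lock = threading.Lock()

def handle_worker(conn):
    # Handle one worker report, then close the connection.
    with conn:
        data = conn.recv(1024).decode()
        with lock:
            received.append(data)

def daemon(server, n_reports):
    # Accept connections, spawn a handler thread per worker, and
    # immediately resume listening (finite n_reports for the demo).
    handlers = []
    for _ in range(n_reports):
        conn, _addr = server.accept()
        h = threading.Thread(target=handle_worker, args=(conn,))
        h.start()
        handlers.append(h)
    for h in handlers:
        h.join()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, 0))  # port 0: let the OS pick a free port
server.listen()
port = server.getsockname()[1]
t = threading.Thread(target=daemon, args=(server, 2))
t.start()

# Demo workers: each sends a single result, as a cron-started job would.
for payload in ('resultFromLocationXY', 'resultFromLocationXX'):
    with socket.create_connection((HOST, port)) as c:
        c.sendall(payload.encode())

t.join()
server.close()
print(sorted(received))
```

For production use, something like ZeroMQ or Celery would handle reconnection, serialization, and the "OK status" handshake more robustly than raw sockets.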
