
Share memory areas between celery workers on one machine

I want to share small pieces of information between my worker nodes (for example cached authorization tokens, statistics, ...) in Celery.

If I create a global inside my tasks file, it is unique per worker (my workers are processes and have a lifetime of one task/execution).
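For example, a module-level global like the one below lives only inside a single worker process. A minimal sketch, assuming a prefork pool; the broker URL, task name, and cached value are placeholders:

from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')  # placeholder broker URL

token_cache = {}  # module-level global: every worker process gets its own copy

@app.task
def fetch_token(name):
    # A token cached here is visible only to this worker process, and
    # with a one-task process lifetime it is gone after the task returns.
    if name not in token_cache:
        token_cache[name] = 'freshly-fetched-token'  # hypothetical placeholder value
    return token_cache[name]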

What is the best practice? Should I save the state externally (in a DB), or create old-fashioned shared memory (which could be difficult because of the different pool implementations in Celery)?

Thanks in advance!

I finally found a decent solution: the multiprocessing Manager from core Python:

from multiprocessing import Manager

# The Manager spawns a server process that owns the shared objects;
# the lock and dict it returns are proxies usable from any process.
manag = Manager()
serviceLock = manag.Lock()
serviceStatusDict = manag.dict()

This dict can be accessed from every process and it is synchronized, but you have to use a lock when accessing it concurrently (as in every other shared-memory implementation).
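A minimal sketch of how this might be wired into a Celery task, assuming a prefork pool; the broker URL and task name are placeholders. The Manager is created at import time so it exists before the pool forks its worker processes:

from multiprocessing import Manager
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')  # placeholder broker URL

# Created at import time, before the prefork pool forks, so every
# worker process inherits proxies that point at the same Manager.
manag = Manager()
serviceLock = manag.Lock()
serviceStatusDict = manag.dict()

@app.task
def record_status(service, status):
    # Single proxy operations are synchronized by the Manager, but a
    # compound read-modify-write needs the shared lock to stay atomic.
    with serviceLock:
        serviceStatusDict[service] = status
        return dict(serviceStatusDict)  # plain dict: picklable as a task result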
