
How can I share a cache between Gunicorn workers?

I am working on a small service using Gunicorn and Flask (Python 3.6). The pseudocode below shows roughly the behavior I want. There are a lot of serialized foo objects, and I want to hold as many of these in memory as possible and delete them on an LRU basis.

from flask import Flask, request

app = Flask(__name__)
cache = Cache()  # placeholder: some in-memory LRU cache implementation

@app.route('/')
def foobar():
    name = request.args['name']
    foo = cache.get(name)
    if foo is None:
        foo = load_foo(name)      # deserialize the foo object by name
        cache.add(name, foo)      # store under the same key used for lookup

    return foo.bar()

The problem I am having is that I do not know how to share this cache between Gunicorn workers. I'm working with limited memory and don't want to be holding duplicate objects. Certain objects will be used very often and some probably never, so I think it really makes sense to hold them in memory.

This is just something that will only be taking requests from another application (both running on the same server); I just wanted to keep this code separate. Am I going in completely the wrong direction by even using Gunicorn in the first place?

I don't see anything wrong with using Gunicorn, but it's probably not necessary to think about scaling horizontally unless you are close to putting this into production. Anyway, I'd recommend using a separate service as a cache, rather than holding one in Python memory. That way, each worker can open a connection to the cache as needed. Redis is a popular option, but you may have to do some data manipulation to store the data, e.g. store the data as a JSON string rather than a Python object. Redis can act as an LRU cache by configuring it: https://redis.io/topics/lru-cache
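
As a rough illustration, here is a minimal sketch of that approach using the redis-py client. It assumes Redis is running locally and has been configured with a memory limit and an LRU eviction policy (e.g. maxmemory 256mb and maxmemory-policy allkeys-lru in redis.conf), and that the foo objects from the question can be converted to and from a dict for JSON serialization; load_foo comes from the question, while foo.to_dict() and Foo.from_dict() are hypothetical helpers:

import json

import redis
from flask import Flask, request

app = Flask(__name__)

# Each Gunicorn worker opens its own connection to the shared Redis instance,
# so cached objects live in one place instead of once per worker process.
r = redis.Redis(host='localhost', port=6379, db=0)

@app.route('/')
def foobar():
    name = request.args['name']
    raw = r.get(name)
    if raw is None:
        foo = load_foo(name)                      # loader from the question
        r.set(name, json.dumps(foo.to_dict()))    # assumes foo serializes to a dict
    else:
        foo = Foo.from_dict(json.loads(raw))      # hypothetical deserializer
    return foo.bar()

With maxmemory-policy allkeys-lru set, Redis evicts the least recently used keys on its own once the memory limit is reached, which matches the LRU behavior described in the question.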
