
Enable multithreading of my web app using python Bottle framework

I have a web app written with the Bottle framework. It has a global dict, somedict, that is accessed by multiple HTTP queries.

After some research, I found that the Bottle framework only supports running my app in 1-process, 1-thread mode (I don't believe this is true; perhaps migrating to another framework like Flask is a good idea).

1 To enable multi-threading, I found a WSGI solution, but it does not support multiple processes (1 thread per process) accessing a global variable like somedict in my app, because each process re-initializes the dict every time a query is handled. How can I handle this issue?

2 Is there any solution other than WSGI that would let this app serve multiple HTTP queries at once?

from bottle import request, route
import threading

# global state shared by all handlers, protected by a lock
somedict = {}
somedict_lock = threading.Lock()

@route("/read")
def read():
    with somedict_lock:
        return somedict  # Bottle renders the dict as JSON

@route("/write", method="POST")
def write():
    with somedict_lock:
        somedict[request.forms.get("key1")] = request.forms.get("value1")
        somedict[request.forms.get("key2")] = request.forms.get("value2")

It's best to serve a WSGI app via a server like gunicorn or waitress, which will handle your concurrency needs, but almost no matter what you do for concurrency, your global in-memory state will not work the way you want it to. You need to use an external memory store like memcached, redis, etc. Static data is one thing, but mutable state should never be shared between web app processes; that's contrary to Python web server idioms and the typical execution model of Python web apps.
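As a concrete illustration of that advice (a sketch of my own, not the exact setup the answer prescribes), the handlers below keep the shared data in Redis instead of a per-process global dict, assuming the redis-py package and a Redis server on localhost; every worker process started by gunicorn or waitress then reads and writes the same hash:

from bottle import request, route, default_app
import redis

# each worker process opens its own connection, but they all talk to the same Redis server
store = redis.Redis(host="localhost", port=6379, db=0, decode_responses=True)

@route("/read")
def read():
    # HGETALL returns the whole hash as a dict; Bottle renders dicts as JSON
    return store.hgetall("somedict")

@route("/write", method="POST")
def write():
    # HSET is atomic on the Redis side, so no application-level lock is needed
    store.hset("somedict", request.forms.get("key1"), request.forms.get("value1"))
    store.hset("somedict", request.forms.get("key2"), request.forms.get("value2"))

app = default_app()

Served with, for example, gunicorn -w 4 myapp:app or waitress-serve --port=8080 myapp:app (myapp is a hypothetical module name), each process or thread sees the same data because the mutable state lives outside the web processes.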

I'm not saying it's literally impossible to do in Python, but it's not the way Python solves this problem.

You can process incoming requests asynchronously; currently, Celery seems very suitable for running asynchronous tasks. Read up on how Celery can do this.
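For completeness, here is a minimal Celery sketch of that idea; the Redis broker URL and the write_pair task are assumptions of mine, not part of the original answer. The web handler only enqueues the work, and a separate Celery worker performs the actual write against the shared store:

# tasks.py -- start a worker with: celery -A tasks worker
from celery import Celery
import redis

celery_app = Celery("tasks", broker="redis://localhost:6379/0")
store = redis.Redis(host="localhost", port=6379, db=0, decode_responses=True)

@celery_app.task
def write_pair(key, value):
    # runs in the Celery worker process, not in the web process
    store.hset("somedict", key, value)

A Bottle handler would then call write_pair.delay(key, value) and return immediately, leaving the concurrency to the Celery worker pool instead of the web server's threads.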
