
Multithreading Falcon in Python

I'm creating a REST API for an application using Falcon. When I launch two or more requests to different endpoints, they are not executed concurrently: one request has to finish before the next one can be handled.

The problem comes from a POST endpoint that executes a complex machine learning process (it takes dozens of seconds to finish). The whole API is blocked while that process runs, because it waits for the process to complete before returning the results.

I'm using wsgiref's simple_server to serve the requests:

from wsgiref import simple_server

if __name__ == '__main__':
    httpd = simple_server.make_server('127.0.0.1', 8000, app)
    httpd.serve_forever()

Is there any way to make the execution parallel so the server can handle multiple requests at the same time?

The server is probably not running in multiprocess or multithreaded mode.

But even if it were, it is not a good idea to tie up the web server with long-running tasks. Long-running tasks should be handed off to separate worker processes.

Take a look at Celery.
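For illustration, here is a minimal sketch of how the long-running step could be pushed into a Celery task. The broker/backend URLs, the task name and the dummy workload are assumptions for this example, not from the original question.

from celery import Celery
import time

# Broker and result backend URLs are placeholders; any broker Celery supports works.
celery_app = Celery('tasks',
                    broker='redis://localhost:6379/0',
                    backend='redis://localhost:6379/0')

@celery_app.task
def run_prediction(payload):
    # Stand-in for the complex machine learning process from the question.
    time.sleep(30)
    return {'input': payload, 'result': 'done'}

The Falcon POST responder would then call run_prediction.delay(data), return the task id right away (for example with HTTP 202), and a separate endpoint could poll the outcome via celery_app.AsyncResult(task_id).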

zaher, ideally you should use Celery as giorgosp mentioned, but if it is mandatory to return the result in the API response, then you can use Gunicorn:

gunicorn --workers 3 -b localhost:8000 main:app --reload

In the command above I have specified 3 workers, so you can serve/process 3 requests at a time.

Ideally, the number of workers can be:

cpu_count * 2 + 1
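As a quick sketch of that rule of thumb, using Python's standard library to read the CPU count:

import multiprocessing

# Suggested Gunicorn worker count: (2 x number of CPU cores) + 1
workers = multiprocessing.cpu_count() * 2 + 1
print(workers)  # e.g. 9 on a 4-core machine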

You can use any port number you like, but make sure that it is above 1024 and it's not used by any other program.

The main:app option tells Gunicorn to invoke the application object app available in the file main.py.
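For reference, a minimal main.py exposing such an app object might look like the sketch below; the resource class and route are made up for illustration.

# main.py -- minimal Falcon app exposing `app` for Gunicorn to import.
import falcon

class PredictionResource:
    def on_post(self, req, resp):
        # The long-running machine learning call would go here.
        resp.media = {'status': 'accepted'}

app = falcon.App()  # on Falcon versions older than 3.0, use falcon.API()
app.add_route('/predict', PredictionResource())

With this file in place, gunicorn --workers 3 -b localhost:8000 main:app finds the app object and serves it.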

Gunicorn provides an optional --reload switch that tells Gunicorn to detect any code changes on the fly. This way you can change your code without having to restart Gunicorn.

And if this approach is not suitable for your needs, then I think you should use Tornado instead of Falcon.

Let me know if any further clarification is needed.

This can be easily achieved by coupling Falcon with Gunicorn. With Gunicorn, achieving multi-threading/multi-processing is relatively easy without needing to implement Celery (although nothing is stopping you from implementing it; Celery is awesome!).

gunicorn -b localhost:8000 main:app --threads 3 --workers 3 --reload

The above command will spin up 3 workers, with each worker having 3 threads. As a developer you can tweak the number of workers and threads as required. I would strongly advise understanding the difference between multithreading and multiprocessing before tweaking these settings.
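If you prefer not to pass everything on the command line, the same settings can live in a Gunicorn configuration file; the values below simply mirror the command above.

# gunicorn.conf.py -- equivalent of the command-line flags above.
bind = 'localhost:8000'
workers = 3    # number of worker processes
threads = 3    # threads per worker process
reload = True  # reload code on change (development only)

Start the server with gunicorn -c gunicorn.conf.py main:app.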
