
Python - Run Code Simultaneously

I have an endpoint that gets some data, runs some code that takes about 30 seconds, and then returns a response specific to that data. I need to be able to hit the endpoint multiple times with different data within those 30 seconds, and the code still needs to run and return the correct, data-specific result for each request.

Here's what I mean:

class Foo(Controller):
    def POST(self, **kwargs):
        [Run Code That Takes 30 Seconds]
        Return [Result That Changes Based on the POST Request Sent]

When I run this right now and hit the endpoint more than once within 30 seconds, the code just restarts with the new data and completely ignores the old data and its results.

How can I allow the endpoint to be hit more than once within those 30 seconds while still returning the corresponding result for each request? Happy to answer any questions!

You should first profile your script to see whether your task is CPU-bound or I/O-bound.

If your task is I/O-bound:

Have a look at the asyncio library (Python 3).
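For instance, a minimal asyncio sketch, assuming the 30-second work is I/O-bound (waiting on a network call, a database, etc.); the names slow_io_task and handle_post are hypothetical, and asyncio.sleep stands in for the real wait:

import asyncio

async def slow_io_task(data):
    await asyncio.sleep(30)  # stand-in for the real 30-second I/O wait
    return "result for " + data

async def handle_post(data):
    # Each request gets its own coroutine; while this one awaits,
    # the event loop is free to serve other requests.
    return await slow_io_task(data)

async def main():
    # Two "requests" with different data, started within the same 30 seconds:
    results = await asyncio.gather(handle_post("first"), handle_post("second"))
    print(results)  # each result matches the data that produced it

asyncio.run(main())

Both complete after roughly 30 seconds in total rather than 60, and each response stays tied to its own input.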

Or at the threading library (Python 2 + Python 3).
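Likewise, a minimal sketch with the standard threading module, again assuming the work is I/O-bound; slow_task and the results dict are hypothetical names, and time.sleep stands in for the real wait:

import threading
import time

results = {}

def slow_task(request_id, data):
    time.sleep(30)  # stand-in for the real 30-second I/O-bound work
    results[request_id] = "result for " + data

# Two "requests" with different data, each handled in its own thread:
threads = [threading.Thread(target=slow_task, args=(i, d))
           for i, d in enumerate(["first", "second"])]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results)  # {0: 'result for first', 1: 'result for second'}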

If your task is CPU-bound

If your task is CPU-bound, you can't use threads because of the GIL (Global Interpreter Lock).

You will have the choice to:

  • use the multiprocessing library (Python 2 + Python 3); see the sketch after this list.

  • run one Python instance per CPU core, each one running a different task.

  • use a task queue like Celery.
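A minimal multiprocessing-based sketch, using concurrent.futures.ProcessPoolExecutor from the standard library; cpu_heavy is a hypothetical stand-in for the 30-second CPU-bound computation:

from concurrent.futures import ProcessPoolExecutor

def cpu_heavy(data):
    total = 0
    for i in range(10_000_000):  # stand-in for the real CPU-bound work
        total += i * len(data)
    return "result for %s: %d" % (data, total)

if __name__ == "__main__":
    # One worker process per CPU core by default; each request's data is
    # processed independently, so results stay tied to their own input.
    with ProcessPoolExecutor() as pool:
        for result in pool.map(cpu_heavy, ["first", "second"]):
            print(result)

Because each task runs in its own process, the GIL is not a bottleneck; a task queue like Celery follows the same idea but hands the work to separate worker processes that can also run on other machines.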
