
Set a queue to run jobs one at a time

In my application I have a route that receives a big array of objects, chunks it into smaller parts (for example 250 rows per part), and pushes every part to the queue. But this data must be processed row by row, and each job must run only after the previous one has finished, because every row depends on some of the rows before it.

My queue driver is Redis with Horizon, and it executes jobs very fast. I tried the later method with a dynamic delay, but that doesn't solve my problem: the client might not be able to handle the big data set, so it splits (paginates) the data and sends it to my API across several requests, and the data from each request must be added to the end of the queue.
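For context, the chunking in the route looks roughly like the sketch below (a minimal sketch; the $rows variable and the plain Queue::push call are illustrative, only the 250-row chunk size comes from the description above):

// split the incoming rows into parts of 250 and queue one import job per part
$dataParts = array_chunk($rows, 250);

foreach ($dataParts as $dataPart) {
    Queue::push(new ImportData($dataPart));
}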

For now I add each part to the queue three times with increasing delays, like below, but I'm looking for a better way:

foreach ($dataParts as $i => $dataPart) {
    // push the same part with three different delays, hoping one of them
    // lands after the previous part has finished
    Queue::later($i * 5, new ImportData($dataPart));
    Queue::later($i * 60, new ImportData($dataPart));
    Queue::later($i * 120, new ImportData($dataPart));
}

I tried job batching, but that still didn't solve my problem. Then I tried the WithoutOverlapping middleware, and that didn't work either.
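A WithoutOverlapping attempt might look roughly like the sketch below (a minimal sketch, assuming Laravel 8+; the lock key 'import-data' and the releaseAfter value are illustrative). On its own this middleware only prevents two jobs from running at the same time; it does not guarantee the parts run in the order they were queued, which is presumably why it didn't help here.

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;

class ImportData implements ShouldQueue
{
    // ...

    // allow only one ImportData job to run at a time; overlapping jobs
    // are released back onto the queue and retried after 60 seconds
    public function middleware()
    {
        return [(new WithoutOverlapping('import-data'))->releaseAfter(60)];
    }
}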

Finally, I added a new supervisor and queue in the horizon.php file, set maxProcesses => 1 for it, and pushed my jobs to that queue. This worked as I wanted.

horizon.php:

...
...
...

'defaults' => [
    // the default queue
    'supervisor-1' => [
        'connection' => 'redis',
        'queue' => ['default'],
        'balance' => 'auto',
        'maxProcesses' => 10,
        'memory' => 128,
        'tries' => 1,
        'nice' => 0,
    ],

    // new supervisor that processes only one job at a time
    'supervisor-single' => [
        'connection' => 'redis',
        'queue' => ['single-queue'],
        'balance' => 'auto',
        'maxProcesses' => 1,
        'memory' => 128,
        'tries' => 1,
        'nice' => 0,
    ],
],

In the controller:

foreach ($dataParts as $dataPart) {
    Queue::push(new ImportData($dataPart), null, 'single-queue');
}
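Assuming ImportData uses the standard Illuminate\Foundation\Bus\Dispatchable trait, an equivalent way to target that queue is:

foreach ($dataParts as $dataPart) {
    // send the job to the queue handled by the single-process supervisor,
    // so the parts are processed strictly one after another
    ImportData::dispatch($dataPart)->onQueue('single-queue');
}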
