
Simultaneously running cron jobs process the same records

I'm struggling with an issue where multiple cron jobs launched at the same time process the same record.

Intro

  • I have a DB table with 100+ customers.
  • For each customer I have to run some script via a cron job every 5 minutes.
  • The cron job takes a customer from the DB and immediately updates its last-used timestamp, so that the next cron run selects the next record whose timestamp is more than 5 minutes old.
  • Processing one customer takes approximately 30 seconds.
  • So to process all 100 customers within the required 5 minutes, I need approximately 10 cron jobs working simultaneously.
  • All crons are defined as: * * * * * php cron.php (so 10 cron jobs are launched every minute).
  • The cron does a network lookup against that customer's IP and logs the result to a log database. This should happen for each customer every 5 minutes, as I will later draw a chart per customer based on the logs.
  • The code is written in PHP; the DB is MySQL.

The problem is that when those 10 cron jobs start at the same time, it randomly happens that 2 of them select the same customer from the database to process.

That is, 2 crons start at almost the same time (microseconds of difference), both select the same last unprocessed row (e.g. id:17) from the database at the same moment, and both then update that same id 17, while the 3rd launched cron job has already taken id:18. But I need each cron job to take a unique next record from the database, not the same one.

As a workaround I tried adding a random sleep(rand(1,10)) delay at the beginning of cron.php, but it doesn't help much; random duplication still happens, because each cron continuously selects the last unprocessed customer, which sometimes matches the customer another cron job selects at the same time.

Is there any solution for this situation?

The solution you seek is collaborative locking. A customer must be marked "locked" by some job, and no script may choose a customer that is locked by another script.

Also, you must do this in such a way that no two jobs choose the same customer to acquire.

In MySQL you can do:

$me = getmypid();
$conn->execute("SELECT GET_LOCK('choosing', 5) AS okay");
// Check the returned value of okay. If it is not 1, exit() immediately.
// Choose some customer in some way -- the smallest Id with OwnedByPid = 0,
// for example. The query should be fast enough to run in under 5 seconds.
$conn->execute("UPDATE customers SET OwnedByPid={$me} WHERE Id={$custId}");
$conn->execute("SELECT RELEASE_LOCK('choosing')");

// Do your work

$conn->execute("SELECT GET_LOCK('choosing', 5)");
$conn->execute("UPDATE customers SET OwnedByPid=0 WHERE OwnedByPid={$me}");
$conn->execute("SELECT RELEASE_LOCK('choosing')");
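The idea behind the pattern is language-agnostic: as long as the "choose a customer and mark it" step is serialized, no two workers can claim the same row. Here is a minimal Python simulation of that idea (illustrative only: `threading.Lock` stands in for MySQL's `GET_LOCK('choosing', 5)`, and a dict stands in for the customers table; all names are made up for the sketch):

```python
import threading

choosing = threading.Lock()               # plays the role of GET_LOCK('choosing', 5)
customers = {i: 0 for i in range(1, 11)}  # Id -> OwnedByPid (0 = free)
claimed = []                              # (worker_pid, customer_id), for inspection
claimed_lock = threading.Lock()

def worker(pid):
    # Serialize the "choose a customer" step, exactly as GET_LOCK does.
    with choosing:
        free = [cid for cid, owner in customers.items() if owner == 0]
        if not free:
            return            # nothing left to claim (okay != 1 -> exit() in SQL)
        cid = min(free)       # smallest free Id, as in the answer above
        customers[cid] = pid  # UPDATE customers SET OwnedByPid=...
    # ... the ~30 s of real work happens here, outside the lock ...
    with claimed_lock:
        claimed.append((pid, cid))

threads = [threading.Thread(target=worker, args=(pid,)) for pid in range(1, 11)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every worker got a distinct customer: no duplicates.
print(sorted(cid for _, cid in claimed))  # ids 1..10, each claimed exactly once
```

The important point is that only the short claim step is serialized; the slow 30-second lookup runs outside the lock, so the 10 jobs still work in parallel.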

Then, periodically, when no scripts are running, release the customers that might still be marked by a script that crashed:

$conn->execute("UPDATE customers SET OwnedByPid=0;");

Or you can add another column, OwningStart, and set it to NOW() when you take ownership, so you can detect when OwningStart is older than some timeout and clear it. Or simply treat such rows as free directly in the selection query:

SELECT MIN(Id) FROM customers WHERE OwnedByPid=0 OR OwningStart < NOW() - INTERVAL 2 MINUTE;
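That selection rule can be sanity-checked in isolation. The sketch below uses in-memory SQLite as a stand-in for MySQL (the interval syntax differs: MySQL's `NOW() - INTERVAL 2 MINUTE` becomes SQLite's `datetime('now', '-2 minutes')`; the table contents are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (Id INTEGER PRIMARY KEY, OwnedByPid INTEGER, OwningStart TEXT);
-- Id 17: claimed 30 seconds ago -> still legitimately busy
INSERT INTO customers VALUES (17, 4242, datetime('now', '-30 seconds'));
-- Id 18: claimed 5 minutes ago -> its owner crashed, so it is eligible again
INSERT INTO customers VALUES (18, 4243, datetime('now', '-5 minutes'));
-- Id 19: never claimed -> free
INSERT INTO customers VALUES (19, 0, NULL);
""")

row = conn.execute(
    "SELECT MIN(Id) FROM customers "
    "WHERE OwnedByPid = 0 OR OwningStart < datetime('now', '-2 minutes')"
).fetchone()
print(row[0])  # 18: the stale Id 18 beats the free Id 19; the busy Id 17 is skipped
```

The 2-minute threshold is just an example; pick anything comfortably larger than the ~30-second processing time so a slow-but-alive job is never stolen from.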
