
MongoDB PHP findAndModify Multiple Performance

I have documents in a collection called Reports that are to be processed. I do a query like

$collectionReports->find(array('processed' => 0)) 

(anywhere between 50 and 2000 items). I process them how I need to and insert the results into another collection, but I need to update the original Report to set processed to the current system time. Right now it looks something like:

$reports = $collectionReports->find(array('processed' => 0));
$toUpdate = array();
foreach ($reports as $report) {
    //Perform the operations on them now
    $toUpdate[] = $report['_id'];
}
// Mark each processed report with the current time, one findAndModify() per document
foreach ($toUpdate as $reportID) {
    $criteria = array('_id' => new MongoId($reportID));
    $data = array('$set' => array('processed' => round(microtime(true)*1000)));
    $collectionReports->findAndModify($criteria, $data);
}

My problem with this is that it is horribly inefficient. Processing the reports and inserting them into the collection takes maybe 700ms for 2000 reports, but just updating the processed times takes at least 1500ms for those same 2000 reports. Any tips to speed this up? Thanks in advance.

EDIT: The processed time doesn't have to be exact; it can just be the time the script is run (+/- 10 seconds or so). If it were possible to take the object ($report) and update the time directly like that, it would be better than searching for it again after the first foreach.

Thanks to Sammaye: changing from findAndModify() to update() seems to work better and faster.
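
For reference, here is a minimal sketch of that update()-based approach, assuming the legacy MongoCollection driver used in the question. A single multi-document update (an $in match on the collected IDs plus the 'multiple' => true option) replaces the per-document findAndModify() calls:

$reports = $collectionReports->find(array('processed' => 0));
$toUpdate = array();
foreach ($reports as $report) {
    // ... process the report and insert the results into the other collection ...
    $toUpdate[] = $report['_id']; // already a MongoId, no need to re-wrap it
}

// One update marks every collected report as processed;
// 'multiple' => true applies the $set to all matching documents.
$collectionReports->update(
    array('_id' => array('$in' => $toUpdate)),
    array('$set' => array('processed' => round(microtime(true)*1000))),
    array('multiple' => true)
);

Unlike findAndModify(), update() does not have to fetch and return the modified document, and with the multiple flag the driver sends a single update instead of one command per report, which is where the time savings should come from.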
