
Keeping Mongoid objects in memory after iterating over them with cursor

I'm iterating over objects using the same cursor a few times, so I'm assuming that keeping the objects in memory would be faster.

I tried putting all the objects into an array before using them, with objects = cursor.to_a , but that call blocks any further calculation while it waits for the data to download, and ends up being slower overall.

Another way I thought of is to append each document to an array as I do the calculations, then use that array for the further calculations. But this is fairly unclean and hard to maintain.
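For reference, here is a minimal sketch of that manual approach, wrapped in a small class so it stays maintainable. The cursor is simulated with a plain Enumerable here; in real code it would be the Mongo::Cursor or Mongoid criteria. The class name is illustrative, not part of any library:

```ruby
# CachingCursor (illustrative name): streams from the underlying cursor on
# the first pass, appending each document to an in-memory buffer as it goes,
# then serves every later pass straight from that buffer.
class CachingCursor
  include Enumerable

  def initialize(source)
    @source = source   # the underlying cursor (anything Enumerable)
    @cache  = nil      # filled lazily by the first complete iteration
  end

  def each(&block)
    return @cache.each(&block) if @cache   # later passes: no database hit
    buffer = []
    @source.each do |doc|
      buffer << doc   # append as we go, so the first pass is not blocked
      yield doc
    end
    @cache = buffer   # only cache once a full pass has completed
  end
end

# Usage with a fake "cursor" of plain hashes standing in for documents:
cursor = CachingCursor.new([{ "_id" => 1 }, { "_id" => 2 }].each)
first  = cursor.map { |d| d["_id"] }  # streams from the source
second = cursor.map { |d| d["_id"] }  # served from the in-memory cache
```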

Is there any code out there that already does this?

So there's already a method that does this in Mongoid. It caches per query and per iteration, only keeping documents in memory as they are fetched, so it doesn't block.

It's just a method on the criteria, and you can call it like this:

Model.where(:name => "John").cache

More info here: http://mongoid.org/docs/extras.html

Try using identity_map ; you can find more details in the docs: http://mongoid.org/docs/installation/configuration.html

identity_map_enabled (false): When set to true Mongoid will store documents loaded from the database in the identity map by their ids, so subsequent database queries for the same document in the same unit of work do not hit the database. This is only for relation queries at the moment. See the identity map documentation for more info.
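As a sketch, the flag from the quoted docs can be switched on in mongoid.yml; the exact nesting varies by Mongoid version, so treat this as an assumption to check against the linked configuration page:

```yaml
# mongoid.yml -- exact nesting varies by Mongoid version
development:
  options:
    identity_map_enabled: true   # defaults to false
```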

Another alternative: store only the id part in the array. Mongo::Cursor does not hold the documents itself; it is something like a pointer into the result set, as far as I understand. More on that here: Mongo Docs
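A minimal sketch of this id-only alternative: walk the cursor once, run the first-pass work on each full document, but keep only the ids in memory. The cursor here is anything Enumerable that yields documents, and Model.find is a hypothetical stand-in for the real model's finder:

```ruby
# collect_ids: iterates the cursor once, yields each full document to the
# caller's block (the first-pass calculation), and keeps only the ids.
def collect_ids(cursor)
  ids = []
  cursor.each do |doc|
    ids << doc["_id"]            # remember just the id, not the whole doc
    yield doc if block_given?    # first-pass calculation on the full doc
  end
  ids
end

# Usage with plain hashes standing in for BSON documents:
ids = collect_ids([{ "_id" => 1, "name" => "John" },
                   { "_id" => 2, "name" => "Jane" }]) do |doc|
  # ... first-pass calculation on doc ...
end
# Later passes re-fetch by id, e.g. docs = Model.find(ids) -- with the
# identity map enabled, those lookups may be served from the map.
```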
