
mongoimport performance degrading over time

I'm using mongoimport to import some JSON files into my MongoDB database. I have 5 files with about 2M documents each, and the collection has 4 regular indexes and 2 multikey ones.

When I start importing the first file I see ~500 documents inserted per second, but performance starts to drop after a while. I'm now importing the 3rd file and I see a throughput of less than 50 documents per second. It seems like the import degrades as the collection size increases. What's going on? How can I improve this?

While it may not be the reason in this case, indexes do have some overhead when it comes to writes, because each write inserts into every index as well as into the collection. There is an additional performance hit if an update causes a document to grow beyond its allotted size: in that case all indexes that include the document need to be updated.

With 6 indexes on the collection, there are 6 index entries to update for every document that is inserted. This will have a noticeable impact on the effective speed of mongoimport.
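To see exactly which indexes every insert has to maintain, you can list them from the shell. A minimal sketch, assuming a database named `test` and a collection named `docs` (both hypothetical names here):

```shell
# List the indexes mongoimport must keep up to date on every insert.
# "test" and "docs" are placeholder names; substitute your own.
mongosh test --eval 'db.docs.getIndexes().forEach(ix => printjson(ix.key))'
```

Note that the `_id` index always appears in this list in addition to any secondary indexes you created.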

You could test this by importing into an unindexed collection and creating the indexes after the fact.
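A sketch of that experiment, assuming the files are named `file1.json` through `file5.json` and using placeholder database/collection names and example index keys (`fieldA`, `tags` are hypothetical field names, not from the original question):

```shell
# Bulk-load all files into a collection with no secondary indexes yet;
# only the mandatory _id index is maintained during the import.
for f in file1.json file2.json file3.json file4.json file5.json; do
  mongoimport --db test --collection docs_noindex --file "$f"
done

# Build the secondary indexes once, after the load is complete.
# Replace the example keys with your 4 regular and 2 multikey index fields.
mongosh test --eval '
  db.docs_noindex.createIndex({ fieldA: 1 });  // example regular index
  db.docs_noindex.createIndex({ tags: 1 });    // example multikey index (array field)
'
```

Building the indexes in one pass after the bulk load is generally faster than maintaining them incrementally on every insert, and it also tells you whether the indexes are in fact the cause of the slowdown.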

See the MongoDB documentation on the write impact of indexes for more information.

