
Getting 'too many open files' error when inserting documents in MongoDB

I am creating a largish (80k collections, 200k documents per collection) database in Mongo. When I start pushing documents into the database, I keep getting the following error after ~800k inserts.

pymongo.errors.WriteError: 24: Too many open files

I have a decent-sized server (2 vCPUs, 32 GB RAM). The ulimit for open files is set to unlimited, and limit nofile 999999 999999 is set in /etc/init/mongodb.conf.
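To confirm the limit actually reached the daemon (a ulimit set in a shell session does not apply to an already-running service), a diagnostic along these lines can be used. This is a sketch only: it assumes a Linux /proc filesystem and a single mongod process, and reading another user's /proc entries may require root.

    import os
    import subprocess

    # Illustrative diagnostic, not part of the fix: the nofile limit that
    # matters is the one mongod inherited at startup, not the one shown
    # by `ulimit -n` in an interactive shell.
    pid = subprocess.check_output(['pgrep', '-x', 'mongod']).split()[0].decode()

    with open('/proc/{0}/limits'.format(pid)) as limits:
        for line in limits:
            if line.startswith('Max open files'):
                print(line.rstrip())  # soft and hard limits as seen by mongod

    # Counting the descriptors mongod currently holds (usually needs root).
    print(len(os.listdir('/proc/{0}/fd'.format(pid))), 'open descriptors')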

I have tried inserting a sleep into the insertion script (in the hope that MongoDB would close files after a while), but that hasn't helped. The only thing that works is restarting MongoDB after an insertion fails; the process can then resume.

How can I make the insertion process more reliable, without having to pause it every few thousand inserts and restart MongoDB?


Sharing the Python script that transfers data points from Redis to Mongo:

    import json

    # Drain the Redis list and insert each document into its target collection.
    while redis_conn.llen(redis_key) > 0:
        redis_json = redis_conn.lpop(redis_key)
        json_doc = json.loads(redis_json)  # was json.loads(redis_point), a NameError
        collection_name = json_doc['collection_name']
        collection = mongodb[collection_name]
        document = {'1': json_doc['1'], '2': json_doc['2']}
        mongo_write_response = collection.insert_one(document)

(For brevity I have simplified the document; the actual document has around 20 data points.)
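One way to reduce the per-document overhead in a loop like this is to batch writes per collection with insert_many, so each pass makes one round trip per collection instead of one per document. The sketch below reuses the redis_conn, redis_key, and mongodb handles from the script above; BATCH_SIZE is an arbitrary, untuned value.

    import json
    from collections import defaultdict

    # Sketch only: drain Redis in chunks, grouping documents by target
    # collection so each batch becomes a single insert_many() call.
    BATCH_SIZE = 1000

    while redis_conn.llen(redis_key) > 0:
        batches = defaultdict(list)
        for _ in range(BATCH_SIZE):
            redis_json = redis_conn.lpop(redis_key)
            if redis_json is None:  # list drained mid-batch
                break
            json_doc = json.loads(redis_json)
            batches[json_doc['collection_name']].append(
                {'1': json_doc['1'], '2': json_doc['2']})
        for name, docs in batches.items():
            mongodb[name].insert_many(docs, ordered=False)

ordered=False lets MongoDB continue past individual failed inserts within a batch, which suits a bulk transfer like this one.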

The problem turned out to be with the MongoDB installation. I was on Debian 9 and had used apt-get to install the MongoDB version that ships bundled with the OS.

Since the MongoDB documentation said Debian 9 was not yet supported, I downgraded the OS to Debian 8 and installed MongoDB Community Edition as per the documentation. It is working well now :)
