How can I insert a large array using MongoDB bulk insert in Node.JS?

I am working on a project related to my master's degree. The project uses MongoDB and Node.JS. In the project, the previous programmer used the insertMany() function to insert multiple records into the database. Now that the data has grown larger, this method fails with the error below.

I replaced it with the initializeUnorderedBulkOp() function, as shown below. It now fails with Unhandled rejection MongoError: E11000 duplicate key error index. Despite this error, the function still inserts all the records into the database. But because the error propagates to the application, the backend stops making progress and my program exits. How can I fix this?

var poolSchema = mongoose.Schema({
    "topic_id": {
        type: Number,
        default: null,
        required: true
    },
    "document_id": {
        type: String,
        default: null,
        required: true
    },
    "index": {
        type: Number,
        default: null
    },
    "score": {
        type: mongoose.Schema.Types.Decimal128,
        default: null
    },
    "search_engine_id": {
        type: String,
        default: null,
        required: false
    },
    "is_assigned": {
        type: Boolean,
        default: false,
        required: false
    },
    "project": {
        type: String,
        trim: true,
        default: null,
        required: true
    },
    "unique_id": {
        type: String,
        default: null,
        unique: true
    },
    "docno_projectid": {
        type: String
    },
    "createddate": {
        type: Date,
        default: Date.now
    }
}, { collection: "pools" });

module.exports.createPoolItems = function (poolItems, callback) {
    populateUniqueId(poolItems);
    Pools.collection.insertMany(poolItems, { ordered: false }, callback);
};

function populateUniqueId(poolItems) {
    poolItems.forEach(element => {
        // Derive a unique key from the project, topic and document ids.
        element.unique_id = `${element.project}_${element.topic_id}_${element.document_id}`;
    });
}

Error:

<--- Last few GCs --->

[15968:000002C335C32F20]    86903 ms: Mark-sweep 1398.1 (1427.9) -> 1396.8 (1424.9) MB, 689.3 / 0.0 ms  (+ 0.0 ms in 25 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 694 ms) (average mu = 0.075, current mu = 0.007) all[15968:000002C335C32F20] 
   87628 ms: Mark-sweep 1397.4 (1424.9) -> 1396.9 (1425.4) MB, 722.7 / 0.0 ms  (average mu = 0.039, current mu = 0.003) allocation failure scavenge might not succeed


<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 000002132645C5C1]
Security context: 0x01f1eef9e6e9 <JSObject>
    1: /* anonymous */ [0000014965ACD3F9] [C:\TopicBinder2-master\node_modules\mongoose\lib\document.js:~878] [pc=00000213267DC691](this=0x03783882f629 <model map = 0000005679E8B731>,pathToMark=0x02ac0065e369 <String[8]: topic_id>,path=0x02ac0065e369 <String[8]: topic_id>,constructing=0x030d1ba828c9 <true>,parts=0x0378388302a1 <JSArray[1]>,schema=0x01b059d...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 00007FF7C206832A v8::internal::GCIdleTimeHandler::GCIdleTimeHandler+4506
 2: 00007FF7C2042DB6 node::MakeCallback+4534
 3: 00007FF7C2043730 node_module_register+2032
 4: 00007FF7C235C14E v8::internal::FatalProcessOutOfMemory+846
 5: 00007FF7C235C07F v8::internal::FatalProcessOutOfMemory+639
 6: 00007FF7C2542874 v8::internal::Heap::MaxHeapGrowingFactor+9620
 7: 00007FF7C2539856 v8::internal::ScavengeJob::operator=+24550
 8: 00007FF7C2537EAC v8::internal::ScavengeJob::operator=+17980
 9: 00007FF7C2540BF7 v8::internal::Heap::MaxHeapGrowingFactor+2327
10: 00007FF7C2540C76 v8::internal::Heap::MaxHeapGrowingFactor+2454
11: 00007FF7C266AF17 v8::internal::Factory::NewFillerObject+55
12: 00007FF7C26E8106 v8::internal::operator<<+73494
13: 000002132645C5C1

When I use initializeUnorderedBulkOp():

module.exports.createPoolItems = function (poolItems, callback) {
    let bulk = Pools.collection.initializeUnorderedBulkOp();
    populateUniqueId(poolItems);

    for (var i = 0; i < poolItems.length; i++) {
        bulk.insert(poolItems[i]);
    }

    bulk.execute();
    // bulk.execute({ w: 0, j: false });
};

Error:

Unhandled rejection MongoError: E11000 duplicate key error index: heroku_bwcmzm1p.pools.$unique_id dup key: { : "tralkc", : 501, : "96284" }    at Function.MongoError.create

I see no error handling in your code. Try running the function asynchronously and catching the exception. You say the data still makes it to your database, so handling the exception, for example in a try block using async/await, should stop your server from crashing.
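That suggestion could be sketched as follows (a sketch, not the asker's code: the helper name `executeIgnoringDuplicates` is mine, and I assume the rejected MongoError carries the standard duplicate-key error code 11000):

```javascript
// Sketch: run a bulk write and treat E11000 (MongoDB duplicate-key
// error, code 11000) as non-fatal. With an unordered bulk op the
// non-duplicate documents are inserted anyway, so the duplicates can
// be swallowed instead of crashing the server.
async function executeIgnoringDuplicates(runOp) {
    try {
        return await runOp();
    } catch (err) {
        if (err && err.code === 11000) {
            return null; // duplicates skipped; the rest went in
        }
        throw err; // any other error is still a real failure
    }
}
```

It would be called as, e.g., `await executeIgnoringDuplicates(() => bulk.execute());` inside an async version of createPoolItems.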
