
How to read data from MongoDB block by block and write to Postgres

I have MongoDB running in production. I want to move the data from MongoDB to Postgres for a migration requirement.

Now, coming to the data part, I am planning to write a utility which will read data from MongoDB and write it to Postgres.

Here I want to move all the data from MongoDB (it contains 240335 documents) to Postgres.

I cannot read the entire data set into memory in the application. I want to read a batch of 10000 documents, apply some modifications, write those to Postgres, then read the next 10000, and repeat until everything is processed.

How can I do this?

I have never done this myself, but I think you can use a cursor to pull the records one at a time. The problem is that a purely one-record-at-a-time solution will be inefficient, so accumulate the documents and write them to Postgres in batches.

Example

var myCursor = db.bios.find();
var batch = [];

while (myCursor.hasNext()) {
    var myDocument = myCursor.next();
    // apply your modifications to myDocument here
    batch.push(myDocument);
    if (batch.length === 1000) {
        // write the batch to Postgres, then start a new one
        batch = [];
    }
}
if (batch.length > 0) {
    // write the remaining documents to Postgres
}

Maybe you can also use streams? I do not know whether the MongoDB driver you are using exposes cursors as streams.
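The read-10000/modify/write loop from the question can be sketched with a small chunking helper. This is a minimal sketch, not a definitive implementation: it assumes a Python utility where the MongoDB cursor (e.g. from pymongo's collection.find()) is treated as a plain iterable, and the Postgres write (e.g. psycopg2's executemany) and the transform() function are hypothetical and left as comments:

```python
def batched(cursor, size):
    """Group an iterable (e.g. a MongoDB cursor) into lists of `size` items."""
    batch = []
    for doc in cursor:
        batch.append(doc)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:            # flush the final, possibly smaller, batch
        yield batch

# Hypothetical usage (pymongo/psycopg2 names are assumptions, not from the question):
# cursor = mongo_db.bios.find()
# for batch in batched(cursor, 10000):
#     rows = [transform(doc) for doc in batch]   # your modifications
#     pg_cursor.executemany("INSERT INTO bios (name, data) VALUES (%s, %s)", rows)
#     pg_conn.commit()                           # commit once per batch, not per row
```

Because the generator only holds one batch in memory at a time, the 240335 documents are never loaded all at once; committing once per batch keeps the Postgres side efficient as well.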
