
Database access for concurrent API requests

How can one ensure that the database calls are asynchronous for concurrent API calls?

For example, I have this model that represents a pool.

{
    limit: number,   // e.g. 5
    isOpen: boolean, // will be set to false when users.length == limit
    users: []
}
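
For concreteness, here is a minimal schema sketch for that model, assuming it is backed by MongoDB via Mongoose (the collection name and field types are assumptions, not stated in the question):

    // hypothetical Mongoose schema for the pool described above
    const mongoose = require('mongoose');

    const poolSchema = new mongoose.Schema({
        limit: { type: Number, default: 5 },
        isOpen: { type: Boolean, default: true }, // closes when users.length == limit
        users: { type: [String], default: [] }
    });

    const Pool = mongoose.model('Pool', poolSchema);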

I want to ensure that at any given time there is only one non-full pool in the DB. So I have something like this running on each API call:

let openPool = await model.findOne({isOpen: true});
if(!openPool){

    //create a new pool with the current user as its first member
    await model.insert({
        limit: 5,
        users: [currentuser],
        isOpen: true
    });
}else{

    //add the current user to the existing open pool
    await model.update(
        {_id: openPool._id},
        {$push: {users: currentuser}}
    );

    //close the pool once it is full
    if(openPool.users.length + 1 >= openPool.limit){
        await model.update(
            {_id: openPool._id},
            {$set: {isOpen: false}}
        );
    }

}

When I send 10 requests to this endpoint at the same time, each of them sees no open pool, thinks it is the first, and creates a new pool. So I end up with 10 new pools where I expected 2 full (closed) pools of 5 users each.
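
For illustration, a minimal reproduction sketch (the endpoint URL and payload shape are assumptions, and fetch is the global available in Node 18+): because all requests are in flight at once, every handler's findOne runs before any insert has committed.

    // hypothetical reproduction: fire 10 concurrent requests at the endpoint
    const requests = Array.from({length: 10}, (_, i) =>
        fetch('http://localhost:3000/pools/join', {
            method: 'POST',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify({user: 'user-' + i})
        })
    );
    Promise.all(requests).then(() => console.log('all 10 requests sent'));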

Is there a way to ensure that all these requests execute sequentially?

Maybe running them sequentially is a bad idea. But overall, I'm looking for a way to keep the state of the DB valid at all times.

Is Node.js a good fit for this?

Any pointers to the right direction will be much appreciated.

The solution was to use a mutex to serialize access to the critical section.
I used async-lock:

const AsyncLock = require('async-lock');
const lock = new AsyncLock();

function requestHandler(currentuser){
    // lock.acquire returns a promise that resolves with the callback's
    // return value once the callback finishes and the lock is released
    return lock.acquire('pool', async () => {
        let openPool = await model.findOne({isOpen: true});
        if(!openPool){

            //create a new pool with the current user as its first member
            await model.insert({
                limit: 5,
                users: [currentuser],
                isOpen: true
            });
        }else{

            //add the current user to the existing open pool
            await model.update(
                {_id: openPool._id},
                {$push: {users: currentuser}}
            );

            //close the pool once it is full
            if(openPool.users.length + 1 >= openPool.limit){
                await model.update(
                    {_id: openPool._id},
                    {$set: {isOpen: false}}
                );
            }
        }

        return true;
    });
}
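
As a sanity check, here is a hypothetical usage sketch (assuming requestHandler is the handler wired to the endpoint above): the 10 concurrent calls now queue up on the shared 'pool' key, so each one sees the state left by the previous one.

    // hypothetical check: the 10 calls now run one at a time inside the lock
    Promise.all(
        Array.from({length: 10}, (_, i) => requestHandler('user-' + i))
    ).then(() => {
        // expected state afterwards: 2 closed pools with 5 users each
    });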
