Wrapping MongoDB calls within a Promise

I'm using Meteor (1.0.3) in general, but for one particular case I'm using a raw server-side route to render a file -- so I'm outside a Meteor method.

I'm using node fs.writeFile/fs.readFile and exec commands to call out to Linux command-line utilities too.

My only point in bringing this up is that the Node calls are of course async, so I've opted to use the Q library for Node to manage the async callbacks.

This all worked until I added a line to call out to the MongoDB database.

A call like so:

var record_name = Mongo_Collection_Name.findOne({_personId: userId}, {fields: {'_id': 0}});

Produces the following error:

[Error: Can't wait without a fiber]

The error only occurs when I wrap the function in a Promise.

For example, something like this will throw:

getRecordExample = function (userId) {
  var deferred = Q.defer();
  var record_name = Mongo_Collection_Name.findOne({_personId: userId}, {fields: {'_id': 0}});

  // do something

  // if no error
  deferred.resolve(record_name);

  return deferred.promise;
}

If I use the Meteor Fibers library I don't get the error:

getRecordExample = function (userId) {
  var deferred = Q.defer();
  var Fiber = Npm.require('fibers');
  var record_name;

  Fiber(function () {
    record_name = Mongo_Collection_Name.findOne({_personId: userId});
  }).run();
  // do something

  // if no error
  deferred.resolve(record_name);

  return deferred.promise;
}

but the record_name variable is undefined outside the fiber, so as far as I can tell I have no way to pass the value out of the Fiber's scope.
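
A minimal sketch of one way around this, keeping the same names as above: resolve the deferred from inside the fiber, so the result rides out on the promise (untested):

getRecordExample = function (userId) {
  var deferred = Q.defer();
  var Fiber = Npm.require('fibers');

  Fiber(function () {
    try {
      // still inside the fiber here, so the collection call is allowed
      var record_name = Mongo_Collection_Name.findOne({_personId: userId});
      deferred.resolve(record_name);
    } catch (error) {
      deferred.reject(error);
    }
  }).run();

  return deferred.promise;
};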

A More Precise Example

This is a little long, so you have to scroll down to see it all. I'm basically building a workflow here so there are processes and subprocesses.

// both/routes.js
Router.route('/get-route', function(req, res) {
  // get the userId then start the workflow below

  // using Promises here because these were firing concurrently
  Q(userId)
  .then(process_1)
  .then(process_2)
  .done();
}, { name: 'server-side-ir-route', where: 'server' });

// server.js
process_1 = function (userId) {
  sub_process_1(userId);

  sub_process_2(userId);

  return userId;
}

process_2 = function (userId) {
  sub_process_3(userId);

  sub_process_4(userId);

  return userId;
}

sub_process_1 = function (userId) {
  var result = get_record_1(userId);

  // do stuff with result

  // using Q library to call out to async fs.writeFile, return Promise
  fs_writeFile_promise(result)
  .catch(function (error) {
    console.log('error in sub_process_1_write', error);
  })
  .done(function () {
    console.log('done with sub_process_1');
  });

  return userId;
}.future() // <-- if no future() here, the exception is thrown.

sub_process_2 = function (userId) {
  var result = get_record_2(userId);

  // do stuff with result

  // using Q library to call out to async fs.writeFile, return Promise
  fs_writeFile_promise(result)
  .catch(function (error) {
    console.log('error in sub_process_2_write', error);
  })
  .done(function () {
    console.log('done with sub_process_2');
  });

  return userId;
}.future()

// async because of I/O operation (I think)
get_record_1 = function (userId) {
  var record_1 = Mongo_Collection_Name.findOne({'userId': userId});
  // do stuff
  return record_1;
}
get_record_2 = function (userId) {
  var record_2 = Mongo_Collection_Name.findOne({'userId': userId});
  // do stuff
  return record_2;
}

// async operation using Q library to return a Promise
fs_writeFile_promise = function (obj) {
  var deferred = Q.defer();
  fs.writeFile(obj.file, obj.datas, function (err, result) {
    if (err) deferred.reject(err);
    else deferred.resolve('write data completed');
  });
  return deferred.promise;
}

For now, let's assume that the process_2 function is exactly like process_1.

Also, we should assume I have console.log('step_start') and console.log('step_end') in each function. This is what it would look like on the command line:

  • start processes
  • end processes
  • start processes 1
  • end processes 1
  • start processes 2
  • start sub processes 1
  • getting record 1
  • start sub processes 2
  • getting record 2
  • returning record 1
  • end sub processes 1
  • called writeData in sub process 1
  • returning record 2
  • called writeData in sub process 2
  • end processes 2
  • ending sub processes 1

The reason I had to place a Fiber (future) on the sub_process_1() function was that when I placed process_1() in the Q chain at the top, I got [Error: Can't wait without a fiber].

If I remove process_1() from the Q chain at the top and remove the .future() from sub_process_1(), no exception is thrown.

Questions

  • Why does calling out to a Mongo collection within a Promise cause a fiber error within a Meteor application?
  • Does calling an async function within a sync function generally cause the sync function to become an async function?
  • How do I solve this problem?

The most common way to solve this is to wrap your asynchronous callbacks that use Meteor functions in Meteor.bindEnvironment().

If you are using the core Meteor WebApp package to handle your server-side route, the code would look like this (also on meteorpad):

WebApp.connectHandlers.use(
  '/test',
  Meteor.bindEnvironment(function(req, res, next) {
    var someSyncData = Players.findOne();
    res.write(JSON.stringify(someSyncData));
    res.end();
  })
);
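
The same idea applies to the asynchronous callbacks themselves, such as the fs.writeFile callback in the question. A rough sketch, reusing the question's collection with a made-up 'written' field, and assuming the surrounding code is already running in a fiber (e.g. the bound handler above):

var fs = Npm.require('fs');

writeAndFlag = function (file, data, userId) {
  // Bind the callback so the collection call inside it has a fiber
  // when fs.writeFile invokes it later.
  fs.writeFile(file, data, Meteor.bindEnvironment(function (err) {
    if (err) {
      console.log('write failed', err);
      return;
    }
    // 'written' is a hypothetical field used only for illustration.
    Mongo_Collection_Name.update({userId: userId}, {$set: {written: true}});
  }));
};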

Working with fibers or promises yourself is unnecessary unless you are trying to get multiple async events to run concurrently.

To deal with file reading or other functions that are not already synchronous, Meteor also provides Meteor.wrapAsync() to make them synchronous.
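
For example, a rough sketch of wrapping fs.readFile (the file path here is purely illustrative):

var fs = Npm.require('fs');

// Meteor.wrapAsync turns the callback-style fs.readFile into a function
// that can be called synchronously from fiber-based code.
var readFile = Meteor.wrapAsync(fs.readFile, fs);

// Inside a method, publication, or a bindEnvironment-wrapped handler:
var contents = readFile('/tmp/example.txt', 'utf8'); // illustrative path only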

There are also packages and a help page that give you other high-level alternatives.
