I have a function that loops through content and calls back with each result. I add each result to an async queue with a concurrency of 1. The worker then makes a request for the URL and streams the response to a file. But for some reason, the queue seems to hang after three calls and I cannot figure out why.
var fs = require('fs'); // this require was missing from the original snippet
var async = require('async');
var request = require('request');

var queue = async.queue(function (url, callback) {
  var file = fs.createWriteStream('./images/' + url.split('/')[5]);
  var image = request(url);
  image.pipe(file);
  file.on('close', function () {
    callback('done');
  });
}, 1);

getUrls(query, function (e, url) {
  queue.push(url, function (data) {
    console.log(data);
  });
});

queue.drain = function () {
  console.log('all items have been processed');
};
Assume all the files are just random images, like this one:
http://pandodaily.files.wordpress.com/2014/03/google-in-bed-w-mercenaries-n-military-e1395865855795.jpg?w=900&h=499
Also assume that getUrls pushes 5 - 10 URLs into the queue.
So basically, the first couple of requests make it through and pipe to a file, but after that it hangs and never reaches the drain function.
Try listening for 'finish' instead of 'close' before calling the callback. 'close' is emitted by the Readable stream, whereas 'finish' is emitted by the Writable stream:

file.on('finish', function () { callback('done'); });
Refer to the API docs for more details: http://nodejs.org/api/stream.html#stream_event_finish