I am using the robinhood-node API to retrieve my security portfolios. The API has two functions that I use:

orders(callback)
Retrieves the last 100 orders and a link to the next page.

url(url, callback)
Fetches the next page with 100 more orders.

Both of these functions return JSON that looks like this:
{
    results: [
        {order 0},
        {order -1},
        ...
        {order -100}
    ],
    next: url_for_next_page
}
Now, I need to pass the next URL to the url function to get another 100 orders. The last page has next: null.

The url function cannot be called until I process the results of orders, and each subsequent url call cannot be made until the previous url results are processed. This has caused me to create a callback hell :-( orders -> url -> url -> url ...

I cannot call the url method in a loop; the loop continues to execute before the callback executes.
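To make the loop problem concrete, here is a minimal self-contained sketch (the url function below is a hypothetical stand-in for the real API, not robinhood-node itself). Every loop iteration finishes before any callback runs, so next can never be updated inside the loop:

```javascript
// Hypothetical stand-in for url(url, callback): the callback fires
// asynchronously, after the current call stack has finished,
// just like a real HTTP request would.
const events = [];

function url(page, callback) {
  setImmediate(() => {
    events.push('callback for ' + page);
    callback();
  });
}

for (const page of ['page1', 'page2']) {
  url(page, () => {});
  events.push('loop iteration ' + page);
}

// Captured synchronously, right after the loop: no callback has run yet.
const afterLoop = events.slice();
console.log(afterLoop); // [ 'loop iteration page1', 'loop iteration page2' ]
```

Both loop iterations complete before either callback executes, which is why a plain while/for loop cannot drive this kind of pagination.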
How do I call url repeatedly? I have so far managed a hack using recursive function calls. Here is my code - https://gist.github.com/lordloh/f5c8c589b4538ab9674db919ae2e2834

I am learning Node.js as I go... :-/
Have you considered using Promises to avoid the callback hell?

I took a look at your code, and I imagine something like this using Promises:
const credentials = {
username: 'username',
password: 'Password'
};
const Robinhood = require('../robinhood-node/src/robinhood');
function orderHandler(orderArray, allOs) {
    return new Promise((resolve, reject) => {
        for (let i = 0; i < orderArray.length; i++) {
            const e = orderArray[i];
            // The instrument id is the second-to-last segment of the instrument URL.
            const ss = e.instrument.split('/');
            const instrument = ss[ss.length - 2];
            allOs.push({
                'id': e.id,
                'instrument': instrument,
                'quantity': e.quantity,
                'price': e.average_price,
                'side': e.side,
                'transaction_time': e.last_transaction_at
            });
        }
        resolve(allOs);
    });
}
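As a side note, since orderHandler only does synchronous work, the explicit Promise constructor is optional; returning Promise.resolve(...) gives callers the same .then() interface. A trimmed sketch (orderHandlerAlt and its fields are illustrative, not part of the API):

```javascript
// Alternative: synchronous work does not need the Promise constructor.
// Promise.resolve() wraps the finished result so callers can still chain .then().
function orderHandlerAlt(orderArray, allOs) {
  for (const e of orderArray) {
    allOs.push({ id: e.id, quantity: e.quantity }); // fields trimmed for brevity
  }
  return Promise.resolve(allOs);
}

orderHandlerAlt([{ id: 'x', quantity: 1 }], [])
  .then((all) => console.log(all.length)); // prints 1
```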
const api = Robinhood(credentials, () => {
    api.orders((e, r, b) => {
        orderHandler(b.results, []).then((allOs) => {
            urlSequence(b.next, allOs).then((allOs) => csv(allOs));
        });
    });
});

function urlSequence(url, allOs) {
    // next is null on the last page, so stop recursing.
    if (!url)
        return Promise.resolve(allOs);
    return new Promise((resolve, reject) => {
        api.url(url, (e, r, b) => {
            orderHandler(b.results, allOs)
                // Resolving with a Promise chains the next page onto this one.
                .then((allOs) => resolve(urlSequence(b.next, allOs)));
        });
    });
}
function csv(Arr) {
    for (let i = 0; i < Arr.length; i++) {
        const A = Arr[i];
        console.log(A.id + ", " + A.instrument + ", " + A.quantity + ", " + A.price + ", " + A.side + ", " + A.transaction_time);
    }
}
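On newer Node versions the same page-by-page sequencing reads more naturally with async/await: wrap the callback API in a Promise once, and then a plain loop works because await pauses it. A self-contained sketch (fetchPage and the pages object are hypothetical stand-ins for Robinhood.orders / Robinhood.url):

```javascript
// Hypothetical stand-in for the paginated callback API.
const pages = {
  start: { results: [1, 2], next: 'p2' },
  p2:    { results: [3, 4], next: null },
};

function fetchPage(url, callback) {
  setImmediate(() => callback(null, null, pages[url])); // (err, response, body)
}

// Wrap the Node-style callback in a Promise.
function fetchPageAsync(url) {
  return new Promise((resolve, reject) => {
    fetchPage(url, (e, r, b) => (e ? reject(e) : resolve(b)));
  });
}

async function getAllOrders() {
  const all = [];
  let body = await fetchPageAsync('start');
  all.push(...body.results);
  while (body.next) {          // a plain loop works here: await pauses it
    body = await fetchPageAsync(body.next);
    all.push(...body.results);
  }
  return all;
}

getAllOrders().then((all) => console.log(all)); // resolves with [1, 2, 3, 4]
```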
Well, let's say we have a function called fetch, and all it does is fetch some data and pass it to a callback. This is how it would be done:
function getAllPages(url, callback) {
// This is a function that returns a function:
const getPage = (oldData) => (response) => {
// We concat the data from the response to our collected list:
const newData = oldData.concat(response.results)
if(response.next && response.next.length) {
// Call the function returned by getPage passing in newData
// This is what we call recursion
fetch(response.next, getPage(newData))
} else {
// If there is no next page we just pass all the collected data to a callback:
callback(newData)
}
}
// We pass it an empty list as initial data:
getPage([])({next: url, results: []})
}
getAllPages('some url...', (data) => {
console.log(data)
})
I hope this helps you get going. If there are too many pages, you could easily add some throttling using setTimeout, or set a limit.
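For completeness, here is a runnable version of the same idea with a stubbed callback-style fetch (hypothetical; in practice it would wrap Robinhood.url), including the setTimeout throttling between page requests:

```javascript
// Stubbed pages and a hypothetical callback-style fetch.
const pages = {
  p1: { results: ['a', 'b'], next: 'p2' },
  p2: { results: ['c'],      next: null },
};

function fetch(url, callback) {
  setImmediate(() => callback(pages[url]));
}

function getAllPages(url, callback) {
  const getPage = (oldData) => (response) => {
    const newData = oldData.concat(response.results);
    if (response.next && response.next.length) {
      // Throttle: wait 100 ms before requesting the next page.
      setTimeout(() => fetch(response.next, getPage(newData)), 100);
    } else {
      callback(newData);
    }
  };
  // Seed the recursion with an empty list and a fake "page" pointing at url.
  getPage([])({ next: url, results: [] });
}

getAllPages('p1', (data) => console.log(data)); // [ 'a', 'b', 'c' ]
```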