
Best practices for handling large data with JS fetch?

The server takes a long time (around 2 min) to export a large JSON payload, and I get a timeout error on the client side before the server responds. I googled around for a bit, but I cannot find any way to extend the timeout or continue after the timeout fires.

fetch(url).then(resolve,reject);

A couple of things you can do here:

1) Fetch the data in chunks. For example, if you are displaying this data in a table, look into something like DataTables server-side processing, which requests rows page by page instead of all at once.

2) If you still want the entire dataset in a single request, try adding database indexes so the export query completes faster.

3) Increase the server timeout.
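Option 1 can be sketched as a simple pagination loop on the client. This is a minimal sketch, assuming a hypothetical endpoint that accepts `page` and `pageSize` query parameters and returns a JSON array — adjust the parameter names to whatever your server-side processing API actually expects:

```javascript
// Build the URL for one page of results (query parameter names are assumptions).
function pageUrl(baseUrl, page, pageSize) {
    return `${baseUrl}?page=${page}&pageSize=${pageSize}`;
}

// Request the export page by page; each small request finishes quickly,
// so no single call comes close to the server's timeout.
async function fetchAllPages(baseUrl, pageSize = 1000) {
    const rows = [];
    for (let page = 0; ; page++) {
        const res = await fetch(pageUrl(baseUrl, page, pageSize));
        if (!res.ok) throw new Error(`HTTP ${res.status}`);
        const chunk = await res.json();
        rows.push(...chunk);
        if (chunk.length < pageSize) break; // short page => no more data
    }
    return rows;
}
```

The trade-off is more round trips, but each response is small enough to stay well under any proxy or server timeout.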

I'm assuming you are using fetch in Node (e.g. node-fetch).

Try using a stream to receive the data in chunks. An example would be:

const fetch = require('node-fetch'); // assuming node-fetch, where res.body is a Node stream
const fs = require('fs');

fetch(url)
    .then(res => {
        return new Promise((resolve, reject) => {
            // Pipe the response body to disk instead of buffering it in memory
            const dest = fs.createWriteStream('some/file/path');
            res.body.pipe(dest);
            res.body.on('error', reject);
            dest.on('finish', resolve);
            dest.on('error', reject);
        });
    });

You may then consume the stream (or the saved file) however you wish.
