
How to send a huge dataset as a response in Node.js?

In my Node.js application I fetch a huge dataset (200,000 rows) from a MySQL database:

// Callback-style query: the stray `await` has been dropped,
// since the callback API does not return a promise.
pool.query(query, binds, function (error, results) {
    if (error) throw error;
    res.send(JSON.stringify(results));
});

I think the best approach is to send the data in chunks as a stream instead of everything at once. I tried the JSONStream package for this task, but it's really confusing. Can someone show me the correct way to send a huge dataset as a response in Node.js with this package?

Here's an example:

// Assumes: const JSONStream = require('JSONStream');
app.get('/data', (req, res) => {
  // Make sure to set the correct content type
  res.set('content-type', 'application/json');

  // Stream the results row by row instead of buffering them all in memory
  pool.query(query, binds)
      .stream()
      .pipe(JSONStream.stringify())
      .pipe(res);
});

The trick is to call the .stream() method on the query so the rows arrive as a stream, pipe them through JSONStream.stringify() to serialize them into a single JSON array chunk by chunk, and pipe that stream to the response.

More info on mysql streaming can be found in the mysql package's documentation on streaming query rows.
