Streaming a large amount of data directly from a MongoDB query into a Node.js response

I am building a MERN stack application and at one point I need to fetch a large amount of data from a MongoDB database, so I came across MongoDB streams and I was wondering what the best way to implement this is. I've tried these two options:

var stream = Product.find({}).stream();

stream.on('error', (err) => {
    console.error(err)
});

stream.on('data', (doc) => {
    return res.json(doc);
});

In this example I receive the error "Cannot set headers after they are sent to the client". Second try:

await Product.find({})
  .cursor()
  .pipe(JSON.stringify())
  .pipe(res);

In this example I get "Cannot read property 'on' of undefined". I was not able to find a proper explanation of the whole cursor().pipe() chaining, so if anyone knows, I would be more than glad if you could explain this logic. I do not want to use pagination in this example.

You can send a response to the client only once; once it is sent, the connection is closed. In your case you are trying to send the response more than once, hence the error.
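The second attempt in the question fails because JSON.stringify() returns a string, not a stream, so there is nothing with an .on method to pipe into. If a single HTTP response is what you want, one way around the error is to pipe the cursor through a Transform stream that serializes each document as it is read, so the response body is written incrementally but only one response is ever sent. This is just a minimal sketch, assuming Express and a recent Mongoose, with a hypothetical /products route:

const { Transform } = require('stream');

app.get('/products', (req, res) => {
  res.setHeader('Content-Type', 'application/json');

  let first = true;
  const toJsonArray = new Transform({
    writableObjectMode: true,   // accept Mongoose documents as input
    transform(doc, _enc, callback) {
      // Prefix every document except the first with a comma so the
      // output forms one valid JSON array.
      const prefix = first ? '[' : ',';
      first = false;
      callback(null, prefix + JSON.stringify(doc));
    },
    flush(callback) {
      // Close the array, or emit an empty one if no documents matched.
      callback(null, first ? '[]' : ']');
    }
  });

  Product.find({})
    .cursor()
    .on('error', (err) => res.destroy(err))
    .pipe(toJsonArray)
    .pipe(res);
});

The response starts flowing as soon as the first document is read, so memory usage stays flat no matter how many documents match.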

If you want to push the data to the client as separate messages rather than as one response, you need to use WebSockets or server-sent events.
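For server-sent events specifically, each document can be written as its own event on a long-lived response. Again only a sketch, assuming Express and a Mongoose version whose cursors support async iteration (Node 10+), with a hypothetical /products/stream route that the browser would consume with EventSource:

app.get('/products/stream', async (req, res) => {
  // Standard server-sent events headers; the connection stays open.
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive'
  });

  const cursor = Product.find({}).cursor();

  try {
    // Each document becomes one SSE message.
    for await (const doc of cursor) {
      res.write(`data: ${JSON.stringify(doc)}\n\n`);
    }
    res.write('event: end\ndata: done\n\n');
  } catch (err) {
    console.error(err);
  } finally {
    res.end();
  }
});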

I usually use this flow:

// Collect every document emitted by the stream, then send them all at once.
var stream = Product.find({}).stream(); // newer Mongoose versions use .cursor() instead
const results = [];

stream.on('error', (err) => {
  console.error(err);
});

stream.on('data', (doc) => {
  results.push(doc);
});

stream.on('end', () => {
  res.json(results);
});
