
Understanding streams in Node.js

I am stuck on an issue where I need to create and download a zip of multiple files using Node.js. Things I have tried without success:

https://github.com/archiverjs/node-archiver/issues/364#issuecomment-573508251

https://github.com/kubernetes/kubernetes/issues/90483#issue-606722433

unexpected behavior using zip-stream NPM on Google k8s

In addition to this, the files are now encrypted as well, so I have to decrypt them on the fly before adding them to the zip.

Though I solved this too, my solution works perfectly while the server is running on a local machine, but fails on Google Kubernetes Engine.

After some more research, I guess this might be a backpressure issue in Node.js streams, but as described in the docs, backpressure is handled automatically by the pipe method. Is it possible that the receiving speed of the browser does not match the sending speed of my server/zipping, and if yes, how do I solve this problem?
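For reference, this is roughly the handshake that pipe performs under the hood. A minimal, self-contained sketch (the fastSource/slowSink names are illustrative, not part of my code):

const { Readable, Writable } = require("stream");

let chunks = 0;
const fastSource = new Readable({
  read() {
    // emit 50 one-MiB chunks as fast as the consumer allows, then end
    this.push(chunks++ < 50 ? Buffer.alloc(1024 * 1024) : null);
  }
});

const slowSink = new Writable({
  write(chunk, encoding, callback) {
    setTimeout(callback, 100); // acknowledge each chunk only after 100 ms
  }
});

// pipe() pauses fastSource whenever slowSink buffers past its
// highWaterMark and resumes it on "drain" - no manual bookkeeping needed
fastSource.pipe(slowSink).on("finish", () => console.log("all flushed"));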

All the samples related to the problem are in the links provided above.

In addition, the readable stream is passed through a decipher to decrypt it.

const Request = require("request");   // streaming HTTP client
const Throttle = require("throttle"); // bandwidth limiter (bps)
const Archiver = require("archiver"); // zip archive stream

const handleEntries = ({ elem, uniqueFiles, archive, speedLimit }) => {
  return new Promise((resolve, reject) => {
    let fileName = elem.fileName;
    const url = elem.url;
    const decipher = elem.decipher;
    // change fileName if the same filename was already added to the zip
    if (uniqueFiles[fileName] || uniqueFiles[fileName] === 0) {
      uniqueFiles[fileName]++;
    } else {
      uniqueFiles[fileName] = 0;
    }
    if (uniqueFiles[fileName]) {
      const lastDotIndex = fileName.lastIndexOf(".");
      const name = fileName.substring(0, lastDotIndex);
      const extension = fileName.substring(lastDotIndex + 1);
      fileName = `${name}(${uniqueFiles[fileName]}).${extension}`;
    }
    // keep a reference to the request itself: its lifecycle events
    // ("complete", "request", "response", "socket") fire on this object,
    // not on the streams piped from it
    const req = Request(url);
    let readableStream = req;
    // create a "Throttle" instance that reads at speedLimit bps
    if (speedLimit) {
      const throttle = new Throttle({ bps: Number(speedLimit) });
      readableStream = readableStream.pipe(throttle);
    }
    // if the file is encrypted, decrypt it before piping into the zip
    readableStream = decipher ? readableStream.pipe(decipher) : readableStream;
    archive.append(readableStream, { name: fileName });
    req.on("complete", result => {
      console.log("Request stream event complete : ", fileName);
      resolve("done");
    });
    req
      .on("error", error => {
        console.log("Request stream event error fileName : ", fileName, " error : ", error);
        resolve("done"); // resolve anyway so one failed file does not hang the loop
      })
      .on("pipe", result => {
        console.log("Request stream event pipe : ", fileName);
      })
      .on("request", result => {
        console.log("Request stream event request : ", fileName);
      })
      .on("response", result => {
        console.log("Request stream event response : ", fileName);
      })
      .on("socket", result => {
        result.setKeepAlive(true);
        console.log("Request stream event socket : ", fileName);
      });
  });
};

const useArchiver = async ({ resp, urls, speedLimit, outputFileName }) => {
  // stream the zip straight into the HTTP response
  resp.writeHead(200, {
    "Content-Type": "application/zip",
    "Content-Disposition": `attachment; filename="${outputFileName}"`,
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS"
  });
  const uniqueFiles = {};
  const archive = Archiver("zip", { zlib: { level: 0 } }); // store entries, no compression
  archive.pipe(resp);
  archive
    .on("close", result => {
      console.log("archive stream event close : ", result);
    })
    .on("drain", result => {
      console.log("archive stream event drain : ", result);
    })
    .on("entry", result => {
      console.log("archive stream event entry : ", result.stats);
    })
    .on("error", error => {
      // no enclosing promise to reject here, so just log the error
      console.log("archive stream event error : ", error);
    })
    .on("finish", result => {
      console.log("archive stream event finish : ", result);
    })
    .on("pipe", result => {
      console.log("archive stream event pipe : ");
    })
    .on("progress", async result => {
      console.log("archive stream event progress : ", result.entries);
      // finalize once every entry has been appended and processed
      if (urls.length === result.entries.total && urls.length === result.entries.processed) {
        await archive.finalize();
        console.log("finalized : ", urls[0]);
      }
    })
    .on("unpipe", result => {
      console.log("archive stream event unpipe : ");
    })
    .on("warning", result => {
      console.log("archive stream event warning : ", result);
    });
  // append entries one at a time; each promise resolves when its request completes
  for (const elem of urls) {
    await handleEntries({ elem, uniqueFiles, archive, speedLimit });
  }
};

I tried this code with archiver and I am getting archiver's drain event while zipping large files. Does pipe handle backpressure or not? If yes, why am I getting the drain event from archiver?
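For context, "drain" is not an error signal: it only means a writable's internal buffer filled up (write() returned false) and has since emptied below its highWaterMark. A minimal sketch of the manual loop that pipe automates (writeAll and its arguments are illustrative):

const fs = require("fs");

// the same handshake pipe() runs internally: write until the buffer
// is full, then wait for "drain" before writing more
async function writeAll(chunks, path) {
  const out = fs.createWriteStream(path);
  for (const chunk of chunks) {
    if (!out.write(chunk)) {
      await new Promise(resolve => out.once("drain", resolve));
    }
  }
  out.end();
}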

Hey :) I've researched your code and can tell you that you use a promise function but never wait for it to finish. You need to wrap zipStreamer.entry with await new Promise(). It must look like this:

const crypto = require("crypto");

async function doSmth() {
  // assumes `algorithm`, `key`, `zipStreamer`, `readableStream` and
  // `fileName` are defined by the surrounding code
  const decipher = crypto.createDecipheriv(
    algorithm,
    Buffer.from(key),
    Buffer.from(key.substring(0, 9))
  );
  await new Promise((resolve, reject) => {
    zipStreamer.entry(readableStream.pipe(decipher), {
      name: fileName
    }, (error, result) => {
      if (!error) {
        resolve("done");
      } else {
        reject("error");
      }
    });
  });
}
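With the entry wrapped in a promise, the entries can then be added strictly one after another. A hypothetical driver, assuming doSmth is parameterized per file:

// hypothetical driver: add entries sequentially, then close the zip
async function addAllEntries(files) {
  for (const file of files) {
    await doSmth(file); // assumes doSmth accepts the file's url/decipher/name
  }
  zipStreamer.finish(); // zip-stream's method to close the archive
}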

It seems that I got the solution for this. I made a few changes in the Kubernetes configuration, i.e. increased the timeout from 30 secs to 300 secs and increased the CPU limit, and tested it multiple times with files of up to 12-13 GB; it works like a charm. I think increasing the CPU from 0.5 to 1 and increasing the timeout did the job for me.
