
Write multiple files to http response with streams in nodejs

I have an array of files that I have to pack into a gzip archive and send through the http response on the fly. That means I can't buffer the whole archive in memory; I have to pipe the files into tar entries one after another, sequentially, or everything is going to break.

const fs = require('fs');               //file system, for read streams and stat
const tar = require('tar-stream');      //lib for tar stream
const { createGzip } = require('zlib'); //lib for gzip stream

//large list of huge files.
const files = [ 'file1', 'file2', 'file3', ..., 'file99999' ];
...

//http request handler:
const pack = tar.pack(); //tar stream, creates .tar
const gzipStream = createGzip(); //gzip stream so we could reduce the size

//pipe archive data through the gzip stream
//and send it to the client on the fly
pack.pipe(gzipStream).pipe(response);

//The issue comes here, when I need to pass multiple files to pack.entry
files.forEach(name => {
    const src = fs.createReadStream(name);    //create stream from file
    const size = fs.statSync(name).size;      //determine its size
    const entry = pack.entry({ name, size }); //create tar entry

    //and this ruins everything: if two different streams
    //write into the entry at the same time, it fails and throws an error
    src.pipe(entry);
});

Basically I need the pipe to finish sending its data (something like await src.pipe(entry);), but pipes in nodejs don't work that way. So is there any way I could get around it?

Never mind, just don't use forEach in this case; write the entries sequentially in a loop and await each one instead.
