
Write multiple files to http response with streams in nodejs

I have a set of files which I have to pack into a gzip archive and send in the http response on the fly. That means I can't store the whole files in memory, and I have to pipe them into tar.entry one at a time, otherwise everything breaks.

const fs = require('fs');               //file system module for read streams and stat
const tar = require('tar-stream');      //lib for tar stream
const { createGzip } = require('zlib'); //lib for gzip stream

//large list of huge files.
const files = [ 'file1', 'file2', 'file3', ..., 'file99999' ];
...

//http request handler:
const pack = tar.pack(); //tar stream, creates .tar
const gzipStream = createGzip(); //gzip stream so we could reduce the size

//pipe archive data through gzip stream
//and send it to the client on the fly
pack.pipe(gzipStream).pipe(response);

//The issue comes here, when I need to pass multiple files to pack.entry
files.forEach(name => {
    const src = fs.createReadStream(name);    //create stream from file
    const size = fs.statSync(name).size;      //determine its size
    const entry = pack.entry({ name, size }); //create tar entry

    //and this ruins everything because if two different streams
    //write something into the entry at the same time, it'll fail and throw an error
    src.pipe(entry);
});

Basically I need the pipe to finish sending the data (something like await src.pipe(entry); ), but pipes in nodejs don't work that way. So is there any way to get around this?

Never mind, just don't use forEach in this case; see the sketch below.
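In other words, replace the forEach with a for...of loop inside an async function and await each file before adding the next entry. A minimal sketch of that idea, assuming Node 15+ so the promise-based pipeline from stream/promises is available (on older versions, events.once(entry, 'finish') can play the same role); sendArchive is just a hypothetical name for the request handler:

const fs = require('fs');
const tar = require('tar-stream');
const { createGzip } = require('zlib');
const { pipeline } = require('stream/promises'); //promise-based pipeline, Node 15+

//hypothetical async request handler wrapper
async function sendArchive(files, response) {
    const pack = tar.pack();
    const gzipStream = createGzip();

    //start streaming the gzipped archive to the client right away
    pack.pipe(gzipStream).pipe(response);

    //add entries strictly one after another
    for (const name of files) {
        const size = fs.statSync(name).size;
        const entry = pack.entry({ name, size });

        //pipeline resolves once the file has been fully written into
        //the entry, so the next iteration only starts after that
        await pipeline(fs.createReadStream(name), entry);
    }

    //no more entries, let tar-stream flush the archive
    pack.finalize();
}

Each iteration waits for pipeline to resolve, so only one stream writes into pack at any moment, which is exactly the ordering tar-stream needs.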

