
How to handle a situation with node.js streams when one stream is dependent on another one?

I am working on a bulk-upload feature and ran into the following problem.

I want to archive files and upload the archive to my server. The archive will also contain a manifest file that describes each file with various attributes/metadata/etc.

The problem appears when I want to send back the response. The stream that reads the manifest file closes before the nested per-file streams finish, which causes the callback to fire immediately. Here is an example:

const crypto = require("crypto");
const csv = require("fast-csv");
const fs = require("fs");
const path = require("path");

async function proccesUpload() {
  const manifestReadStream = fs.createReadStream(
    path.join(__dirname, "manifest.txt")
  );

  manifestReadStream
    .pipe(
      csv.parse({
        delimiter: ";",
      })
    )
    .on("data", async (row) => {
      // do processing for each file described in manifest file
      // (targetFile is assumed to be derived from the row)
      const hash = crypto.createHash("sha1");
      const rs = fs.createReadStream(targetFile, {
        flags: "r",
        autoClose: true,
      });
      rs.on("data", (data) => hash.update(data, "utf-8"));
      rs.on("close", function onReadStreamClose() {
        // do processing for file
      });
    })
    .on("end", async () => {
      // return response when all formatting was performed
    });
}

Because of the nested read streams, "end" fires before all the files have been processed. How can I fix this?

I suggest using async iterators; it makes the code simpler and removes the need for callbacks:

const crypto = require("crypto");
const csv = require("fast-csv");
const fs = require("fs");
const path = require("path");

async function proccesUpload() {
  const manifestReadStream = fs.createReadStream(
    path.join(__dirname, "manifest.txt")
  );

  const parserStream = manifestReadStream.pipe(
    csv.parse({
      delimiter: ";",
    })
  );

  for await (const row of parserStream) {
    // do processing for each file described in manifest file
    // (targetFile is assumed to be derived from the row)
    const hash = crypto.createHash("sha1");
    const rs = fs.createReadStream(targetFile, {
      flags: "r",
      autoClose: true,
    });
    for await (const data of rs) {
      hash.update(data, "utf-8");
    }
    // DONE PROCESSING THE ROW
  }

  // DONE PROCESSING ALL FILES
  // return response when all formatting was performed
}

