
Write data chunks to an S3 file through a stream instead of creating a temporary file (Node.js)

I am trying to write chunks of CSV-formatted data to a file in Amazon S3 directly, instead of writing to a temporary file through a WriteStream, then creating a ReadStream on that file and sending it to S3. My program pulls rows of data from a database, processes them, and formats each row as a CSV string, like so, before uploading with S3's upload() API:

const fs = require('fs');
const AWS = require('aws-sdk');

let recordsCSVFormatted;
let offset = 0;
const batchSize = 500;
const writer = fs.createWriteStream('./someFile.csv');

do {
  recordsCSVFormatted = await getRecords(batchSize, offset); // gets records from the DB, formats them as a CSV string
  if (recordsCSVFormatted) writer.write(recordsCSVFormatted);
  offset += batchSize;
} while (typeof recordsCSVFormatted === 'undefined' || (recordsCSVFormatted && recordsCSVFormatted.length));

writer.end(); // flush and close the temporary file before reading it back

const reader = fs.createReadStream('./someFile.csv');

// just assume here that Key and Bucket are provided in upload, they are in actual code
await new AWS.S3({ ...s3Opts }).upload({ Body: reader }).promise(); // pass the readable in here for AWS

How can I skip the step of creating a temporary file and instead pass the data to AWS as a stream? I want to be able to stream the chunks of CSV data directly as they are produced.

Solved this by extending the Readable class and implementing a custom _read() method for the S3 upload to consume.
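
A minimal sketch of that approach, assuming the same getRecords(batchSize, offset) helper and s3Opts from the question (the class name RecordsStream is just illustrative):

const { Readable } = require('stream');
const AWS = require('aws-sdk');

// Pulls one batch of rows from the DB each time the consumer asks for data.
// Node will not call _read() again until push() is invoked, so only one
// batch is in flight at a time.
class RecordsStream extends Readable {
  constructor(batchSize) {
    super();
    this.batchSize = batchSize;
    this.offset = 0;
  }

  _read() {
    getRecords(this.batchSize, this.offset) // assumed async helper from the question
      .then((csvChunk) => {
        this.offset += this.batchSize;
        if (csvChunk && csvChunk.length) {
          this.push(csvChunk); // hand this CSV string to the upload
        } else {
          this.push(null); // no more rows: signal end-of-stream
        }
      })
      .catch((err) => this.destroy(err)); // surface DB errors on the stream
  }
}

// Key and Bucket omitted here, as in the question
await new AWS.S3({ ...s3Opts }).upload({ Body: new RecordsStream(500) }).promise();

upload() drains the stream into multipart chunks as it goes, so rows are fetched from the database lazily and nothing is ever written to disk.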
