
What's better readSync or createReadStream (with Symbol.asyncIterator)?

  1. createReadStream (with Symbol.asyncIterator)
const fs = require('fs');

async function* readChunkIter(chunksAsync) {
  for await (const chunk of chunksAsync) {
    // magic
    yield chunk;
  }
}

const fileStream = fs.createReadStream(filePath, { highWaterMark: 1024 * 64 });
const readChunk = readChunkIter(fileStream);
  2. readSync
function* readChunkIter(fd) {
  // loop
    // magic
    fs.readSync(fd, buffer, 0, chunkSize, bytesRead);
    yield buffer;
}

const fd = fs.openSync(filePath, 'r');
const readChunk = readChunkIter(fd);

What's better to use with a generator function and why?

upd: I'm not looking for a better way, I want to know the difference between using these features.

To start with, you're comparing a synchronous file operation fs.readSync() with an asynchronous one in the stream (which uses fs.read() internally). So that's a bit like apples and oranges for server use.

If this is on a server, then NEVER use synchronous file I/O except at server startup time, because when processing requests or any other server events, synchronous file I/O blocks the entire event loop during the file read operation, which drastically reduces your server scalability. Only use asynchronous file I/O, which between your two cases would be the stream.

Otherwise, if this is not on a server or any process that cares about blocking the node.js event loop during a synchronous file operation, then it's entirely up to you which interface you prefer.
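
For illustration, here is a minimal sketch (assuming a placeholder ./big.file path) that makes the blocking difference visible: a timer keeps ticking while the stream is read asynchronously, but stalls while the readSync loop holds the event loop.

const fs = require('fs');

const filePath = './big.file'; // placeholder path for illustration

// A timer that fires every 100 ms whenever the event loop is free.
const timer = setInterval(() => console.log('tick'), 100);

// Asynchronous read: ticks keep printing while chunks arrive.
async function asyncRead() {
  for await (const chunk of fs.createReadStream(filePath, { highWaterMark: 1024 * 64 })) {
    // process chunk
  }
}

// Synchronous read: no ticks print until the loop finishes, because
// every fs.readSync() call blocks the event loop while it runs.
function syncRead() {
  const fd = fs.openSync(filePath, 'r');
  const buffer = Buffer.alloc(1024 * 64);
  let bytesRead;
  do {
    bytesRead = fs.readSync(fd, buffer, 0, buffer.length, null);
  } while (bytesRead > 0);
  fs.closeSync(fd);
}

asyncRead()
  .then(() => syncRead())
  .finally(() => clearInterval(timer));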


Other comments:

  1. It's also unclear why you wrap for await() in a generator. The caller can just use for await() themselves and avoid the wrapping in a generator (see the sketch after this list).

  2. Streams for reading files are usually used in an event driven manner, by adding an event listener to the data event and responding to data as it arrives. If you're just going to asynchronously read chunks of data from the file, there's really no benefit to a stream. You may as well just use fs.read() or the promise-based filehandle.read() (also sketched after this list).

  3. We can't really comment on the best/better way to solve a problem without seeing the overall problem you're trying to code for. You've just shown one little snippet of reading data. The best way to structure that depends upon how the higher level code can most conveniently use/consume the data (which you don't show).
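
As a rough sketch of what comments 1 and 2 suggest (filePath and chunkSize are placeholders): the caller can iterate the stream directly with for await, or skip the stream entirely and read chunks through the promise-based FileHandle API.

const fs = require('fs');
const fsp = fs.promises;

// Comment 1: iterate the stream directly; no wrapping generator needed.
async function consumeStream(filePath) {
  const fileStream = fs.createReadStream(filePath, { highWaterMark: 1024 * 64 });
  for await (const chunk of fileStream) {
    // the "magic" on chunk goes here
  }
}

// Comment 2: read fixed-size chunks without a stream, using
// fsp.open() and filehandle.read().
async function consumeChunks(filePath, chunkSize = 1024 * 64) {
  const filehandle = await fsp.open(filePath, 'r');
  const buffer = Buffer.alloc(chunkSize);
  try {
    let bytesRead;
    do {
      ({ bytesRead } = await filehandle.read(buffer, 0, chunkSize, null));
      // the "magic" on buffer.subarray(0, bytesRead) goes here
    } while (bytesRead > 0);
  } finally {
    await filehandle.close();
  }
}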

I really didn't ask the right question. I'm not looking for a better way, I want to know the difference between using these features.

Well, the main difference is that fs.readSync() is blocking and synchronous and thus blocks the event loop, ruining the scalability of a server, and should never be used (except during startup code) in a server environment. Streams in node.js are asynchronous and do not block the event loop.

Other than that difference, streams are a higher level construct than just reading the file directly. They should be used when you're actually using features of streams, and probably should not be used when you're just reading chunks from the file directly and aren't using any features of streams.

In particular, error handling is not always so clear with streams, particularly when trying to use await and promises with streams. This is probably because readstreams were originally designed to be an event driven object, and that means communicating errors indirectly on an error event, which complicates the error handling on straight read operations. If you're not using the event driven nature of readstreams or some transform feature or some other major feature of streams, I wouldn't use them - I'd use the more traditional fs.promises.readFile() to just read data.
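
A minimal sketch of that last suggestion, assuming you just want the whole file's contents: errors surface as ordinary promise rejections that a try/catch handles, with no 'error' event to wire up.

const fsp = require('fs').promises;

async function loadFile(filePath) {
  try {
    // Reads the entire file into a Buffer in one call.
    return await fsp.readFile(filePath);
  } catch (err) {
    // Errors arrive as rejections, not as 'error' events on a stream.
    console.error('read failed:', err.message);
    throw err;
  }
}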
