
Readable stream implementation for big strings

I want to implement a readable stream for strings that can be 1 MB or larger; they are created by processing some large files as well. These strings will be sent over HTTP to multiple clients, and their content needs to be sent asynchronously. The question is: how should I organize the string splitting/reading, and what size should I use for the chunks? From what I have observed, the File System module uses 64 KB chunks when streaming a file.

Check out GridFS in MongoDB.

You can handle files in chunks in MongoDB using GridFS.

It is pretty straightforward.

Save the file with its file type and download it with the appropriate file type.

