
Serializing Json directly into AWS S3 bucket with Newtonsoft.Json

I have an object that has to be converted to Json format and uploaded via a Stream object. This is the AWS S3 upload code:

        AWSS3Client.PutObjectAsync(new PutObjectRequest()
        {
            InputStream = stream,
            BucketName = name,
            Key = keyName
        }).Wait();

Here, stream is a Stream that AWSS3Client reads from. The data I am uploading is a complex object that has to be in Json format.

I can convert the object to a string using JsonConvert.SerializeObject, or serialize it to a file using JsonSerializer, but since the amount of data is quite significant I would prefer to avoid a temporary string or file and convert the object to a readable Stream right away. My ideal code would look something like this:

        AWSS3Client.PutObjectAsync(new PutObjectRequest()
        {
            InputStream = MagicJsonConverter.ToStream(myDataObject),
            BucketName = name,
            Key = keyName
        }).Wait();

Is there a way to achieve this using Newtonsoft.Json?

You need two things here: first, a producer/consumer stream, e.g. the BlockingStream from this StackOverflow question; and second, a Json.Net serializer writing to that stream, as in this other SO question.
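A minimal sketch of that pattern, using System.IO.Pipelines as the producer/consumer stream in place of a hand-rolled BlockingStream (AWSS3Client, name, keyName, and myDataObject are the question's variables):

using System.IO;
using System.IO.Pipelines;
using System.Threading.Tasks;
using Amazon.S3.Model;
using Newtonsoft.Json;

var pipe = new Pipe();

// Producer: serialize straight into the pipe's write side on a background task.
var producer = Task.Run(() =>
{
    using (var writer = new StreamWriter(pipe.Writer.AsStream()))
    {
        new JsonSerializer().Serialize(writer, myDataObject);
    } // Disposing the StreamWriter completes the pipe's writer.
});

// Consumer: S3 reads from the pipe's read side while the producer writes.
// The pipe stream is not seekable; if PutObjectAsync requires a known
// content length, TransferUtility.UploadAsync (multipart) accepts such streams.
await AWSS3Client.PutObjectAsync(new PutObjectRequest
{
    InputStream = pipe.Reader.AsStream(),
    BucketName = name,
    Key = keyName
});
await producer;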

Another practical option is to wrap the memory stream with a gzip stream (2 lines of code).
Usually, JSON files compress very well (a 1 GB file can shrink to around 50 MB).
Then, when serving the stream to S3, wrap it with a gzip stream that decompresses it.
I guess the trade-off compared to a temp file is CPU vs. IO (both will probably work well). If you can keep it compressed on S3, it will save you space and improve networking efficiency too.
Example code:

using System.IO;
using System.IO.Compression;
using Newtonsoft.Json;

var compressed = new MemoryStream();
using (var zip = new GZipStream(compressed, CompressionLevel.Fastest, true))
using (var writer = new StreamWriter(zip))
{
    // Write to the zip stream: serialize the object straight into it.
    new JsonSerializer().Serialize(writer, myDataObject);
}
compressed.Seek(0, SeekOrigin.Begin);
// Use the stream to upload to S3.
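Since the object is stored compressed in that case, the upload can also advertise the encoding so that HTTP clients honoring Content-Encoding decompress transparently. A short sketch, reusing the question's AWSS3Client, name, and keyName:

var request = new PutObjectRequest
{
    InputStream = compressed,
    BucketName = name,
    Key = keyName
};
// Clients that honor Content-Encoding will decompress on download.
request.Headers.ContentEncoding = "gzip";
AWSS3Client.PutObjectAsync(request).Wait();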
