System.OutOfMemoryException when reading content of a file in a Web API
I want to send the content of a file as a memory stream to an S3 bucket via Amazon Firehose. Below is my attempt, which works fine for small files, but with a 1 GB file I get {"Exception of type 'System.OutOfMemoryException' was thrown."}.
My code snippet:
[HttpPost]
public async Task<bool> Upload()
{
    try
    {
        var filesReadToProvider = await Request.Content.ReadAsMultipartAsync();
        foreach (var stream in filesReadToProvider.Contents)
        {
            var fileBytes = await stream.ReadAsByteArrayAsync(); // THIS IS WHERE EXCEPTION COMES
            using (MemoryStream memoryStream = new MemoryStream(fileBytes))
            {
                PutRecordRequest putRecord = new PutRecordRequest();
                putRecord.DeliveryStreamName = myStreamName;
                Record record = new Record();
                record.Data = memoryStream;
                putRecord.Record = record;
                await kinesisClient.PutRecordAsync(putRecord);
            }
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        throw;
    }
    return true;
}
I looked into this link OutOfMemoryException but I could not comprehend it. Please help me.
Attempt 1:
var filesReadToProvider = await Request.Content.ReadAsMultipartAsync();
foreach (var stream in filesReadToProvider.Contents)
{
    var fileByte = await stream.ReadAsStreamAsync();
    MemoryStream _ms = new MemoryStream();
    fileByte.CopyTo(_ms); // EXCEPTION HERE
    try
    {
        PutRecordRequest putRecord = new PutRecordRequest();
        putRecord.DeliveryStreamName = myStreamName;
        Record record = new Record();
        record.Data = _ms;
        putRecord.Record = record;
        await kinesisClient.PutRecordAsync(putRecord);
    }
    catch (Exception ex)
    {
        Console.WriteLine("Failed to send record to Kinesis. Exception: {0}", ex.Message);
    }
}
[HttpPost]
public async Task<bool> Upload()
{
    try
    {
        using (var requestStream = await Request.Content.ReadAsStreamAsync())
        {
            PutRecordRequest putRecord = new PutRecordRequest();
            putRecord.DeliveryStreamName = myStreamName;
            Record record = new Record();
            record.Data = requestStream;
            putRecord.Record = record;
            await kinesisClient.PutRecordAsync(putRecord);
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        throw;
    }
    return true;
}
This will read the data in chunks. Keep everything in the Stream so you don't keep all the bytes around in a huge array.
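A caveat worth adding: Firehose limits the size of a single record, so a 1 GB upload cannot go out as one PutRecord call regardless of how it is buffered. A minimal sketch of the chunking idea (the helper name `ReadChunks` and the chunk size are my own choices, not from the original post) — read the request stream in fixed-size slices so only one slice is in memory at a time, then send each slice as its own record:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class ChunkedReader
{
    // Read a stream in fixed-size chunks so the whole payload is never
    // held in memory at once. Each returned chunk could then become one
    // Firehose record in its own PutRecordAsync call.
    public static IEnumerable<byte[]> ReadChunks(Stream source, int chunkSize)
    {
        var buffer = new byte[chunkSize];
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            var chunk = new byte[read];        // copy only the bytes actually read
            Array.Copy(buffer, chunk, read);
            yield return chunk;
        }
    }

    static void Main()
    {
        // Demo with an in-memory stream; in the Web API this would be the
        // stream returned by Request.Content.ReadAsStreamAsync().
        using (var source = new MemoryStream(new byte[2500000]))
        {
            int count = 0;
            foreach (var chunk in ReadChunks(source, 1000000))
                count++;
            Console.WriteLine(count); // 3 chunks: 1 MB + 1 MB + 0.5 MB
        }
    }
}
```

Whether splitting one file across many records is acceptable depends on how the S3 side reassembles them; that design question is outside the scope of this answer.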
When reading large files, I use StreamReader's ReadLine() method. It works on large files as it manages file system caching internally. Can you use this method instead? Is there a reason why you are implementing the MemoryStream class? You have a comment asking how to inject the data — did you try using one of MemoryStream's methods?
https://docs.microsoft.com/en-us/dotnet/api/system.io.memorystream?view=netframework-4.7.2
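To illustrate the ReadLine() suggestion above — a small self-contained sketch (the sample file and its contents are mine, for demonstration only). StreamReader buffers internally, so only a small window of the file is resident in memory no matter how large the file is:

```csharp
using System;
using System.IO;

class LineStreaming
{
    static void Main()
    {
        // Write a tiny sample file; in practice this would be the
        // large (e.g. 1 GB) input file on disk.
        string path = Path.GetTempFileName();
        File.WriteAllLines(path, new[] { "alpha", "beta", "gamma" });

        int lines = 0;
        using (var reader = new StreamReader(path))
        {
            string line;
            // ReadLine() returns null at end of stream; only one line
            // (plus the reader's internal buffer) is in memory at a time.
            while ((line = reader.ReadLine()) != null)
                lines++;
        }
        Console.WriteLine(lines); // 3
        File.Delete(path);
    }
}
```

Note this only applies if the upload is line-oriented text; for arbitrary binary content you would read fixed-size byte chunks from the Stream instead.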
Update:
Not sure if this is helpful since the code is substantially different from what you are using. But yours isn't working either, so it's just a suggestion.
http://www.tugberkugurlu.com/archive/efficiently-streaming-large-http-responses-with-httpclient