
Entity Framework - OutOfMemory Exception

I'm developing a Silverlight Business Application and want to implement a "multipart" upload, which splits a single file into parts with a size of 4096 KB. To upload these parts from client to server, I'm using a WebClient (client side) and a generic handler (*.ashx, server side).

Strategy: With the first part, a new instance of an Entity Framework class is created. This object has a field/property "binary" (in SQL it's a varbinary(MAX), in Entity Framework it's a byte[]). I store the first part in the property "binary" and execute SaveChanges(). Then, the handler returns the ID (primary key) of this new object to the client.

The second request to the server contains, besides the second part of my file, the ID returned after the first request. On the server, I load the previously created object from the database and append the second part:

myobject.binary = myobject.binary.Concat(bytes).ToArray<byte>();

myobject is the previously created object; bytes is the part I want to append to the binary property.
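For context, the append path in my handler looks roughly like this (a simplified sketch; the context/entity names MyObjectContext, MyObjects and the helper ReadRequestBody are placeholders, not my actual model):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Web;

public class MultipartUpload : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // ID of the row created by the first request
        int id = int.Parse(context.Request.QueryString["id"]);
        byte[] bytes = ReadRequestBody(context);

        using (var db = new MyObjectContext())
        {
            var myobject = db.MyObjects.Single(o => o.Id == id);
            // Append the new chunk to the existing varbinary content;
            // this is the line that eventually leads to the OutOfMemoryException
            myobject.binary = myobject.binary.Concat(bytes).ToArray<byte>();
            db.SaveChanges();
            context.Response.Write(myobject.Id);
        }
    }

    private static byte[] ReadRequestBody(HttpContext context)
    {
        using (var ms = new MemoryStream())
        {
            context.Request.InputStream.CopyTo(ms);
            return ms.ToArray();
        }
    }

    public bool IsReusable { get { return false; } }
}
```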

I repeat this "strategy" until the whole file is uploaded to the server. This works fine for files with a maximum size of ~78 MB. For files with a size of ~83 MB it works sporadically. Files with a size of ~140 MB will abort with an OutOfMemoryException at SaveChanges().

StackTrace

at System.Object.MemberwiseClone()
at System.Array.Clone()
at System.Data.Common.CommandTrees.DbConstantExpression..ctor(TypeUsage resultType, Object value)
at System.Data.Mapping.Update.Internal.UpdateCompiler.GenerateValueExpression(EdmProperty property, PropagatorResult value)
at System.Data.Mapping.Update.Internal.UpdateCompiler.BuildSetClauses(DbExpressionBinding target, PropagatorResult row, PropagatorResult originalRow, TableChangeProcessor processor, Boolean insertMode, Dictionary`2& outputIdentifiers, DbExpression& returning, Boolean& rowMustBeTouched)
at System.Data.Mapping.Update.Internal.UpdateCompiler.BuildUpdateCommand(PropagatorResult oldRow, PropagatorResult newRow, TableChangeProcessor processor)
at System.Data.Mapping.Update.Internal.TableChangeProcessor.CompileCommands(ChangeNode changeNode, UpdateCompiler compiler)
at System.Data.Mapping.Update.Internal.UpdateTranslator.<ProduceDynamicCommands>d__0.MoveNext()
at System.Linq.Enumerable.<ConcatIterator>d__71`1.MoveNext()
at System.Data.Mapping.Update.Internal.UpdateCommandOrderer..ctor(IEnumerable`1 commands, UpdateTranslator translator)
at System.Data.Mapping.Update.Internal.UpdateTranslator.ProduceCommands()
at System.Data.Mapping.Update.Internal.UpdateTranslator.Update(IEntityStateManager stateManager, IEntityAdapter adapter)
at System.Data.EntityClient.EntityAdapter.Update(IEntityStateManager entityCache)
at System.Data.Objects.ObjectContext.SaveChanges(SaveOptions options)
at MyObjectContext.SaveChanges(SaveOptions options) in PathToMyEntityModel.cs:Line 83.
at System.Data.Objects.ObjectContext.SaveChanges()
at MultipartUpload.ProcessRequest(HttpContext context) in PathToGenericHandler.ashx.cs:Line 73.

Does anyone have an idea what's wrong with my implementation? If you need more information or code snippets, please let me know.

Kind regards, Chris

Think about it. After having uploaded (for example) 130 MB, how much memory is required to execute this line:

myobject.binary = myobject.binary.Concat(bytes).ToArray<byte>();

Obviously, the previous array is in memory; that's 130 MB. And somehow the new array must be in memory too; that's another 130 MB, right?

It is actually a lot worse. Concat() is producing a sequence, and ToArray() doesn't know how big it will be.

So what ToArray() does is create an internal buffer and start filling it with the output from the Concat() iterator. Obviously, it does not know how big the buffer should be, so every once in a while it will find that there are more bytes coming in than its buffer can hold. It then needs to create a bigger buffer. What it will do is create a buffer that is twice as big as the previous one, copy everything over and start using the new buffer. But that means that at some point, the old buffer and the new one must be in memory at the same time.

At some point, the old buffer will be 128 MB, and the new buffer will be 256 MB. Together with the 130 MB of the old file, that is about half a gigabyte. Now let's hope no two (or more) users do this at the same time.
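You can see where the numbers come from with a quick simulation of that doubling strategy (a minimal illustration of the growth pattern, not the actual ToArray() source):

```csharp
using System;

class BufferGrowth
{
    static void Main()
    {
        // Simulate the buffer growth ToArray() performs for a sequence of
        // unknown length: whenever the buffer fills up, a new buffer of
        // double the size is allocated and the old contents are copied over.
        long capacity = 4;
        long peak = 0;
        long target = 130L * 1024 * 1024; // ~130 MB of incoming bytes

        while (capacity < target)
        {
            long newCapacity = capacity * 2;
            // During the copy, the old and new buffers coexist in memory
            peak = Math.Max(peak, capacity + newCapacity);
            capacity = newCapacity;
        }

        Console.WriteLine(capacity / (1024 * 1024)); // final buffer: 256 MB
        Console.WriteLine(peak / (1024 * 1024));     // old + new during the last copy: 384 MB
    }
}
```

And that peak is on top of the 130 MB array already stored on the entity, which is why the 140 MB upload dies long before you "run out" of physical RAM.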

I would suggest you use a different mechanism. For example, store your uploaded chunks in a temporary file on disk. When a new chunk comes in, just append it to the file. Only when the upload is completed, do whatever it is that you have to do with the file, e.g. store it in the database.
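A minimal sketch of that approach (uploadId, bytes, isLastChunk, myobject and db are placeholders you would wire into your handler):

```csharp
using System.IO;

// Append each incoming chunk to a temp file instead of growing a
// byte[] in memory. FileMode.Append creates the file on the first
// chunk and appends on every subsequent one.
string tempPath = Path.Combine(Path.GetTempPath(), uploadId + ".part");

using (var file = new FileStream(tempPath, FileMode.Append, FileAccess.Write))
{
    file.Write(bytes, 0, bytes.Length); // 'bytes' is the current 4096 KB chunk
}

if (isLastChunk)
{
    // Touch the database only once, when the upload is complete.
    // (With SQL Server you could even stream the file into the
    // varbinary(MAX) column instead of materializing it here.)
    myobject.binary = File.ReadAllBytes(tempPath);
    db.SaveChanges();
    File.Delete(tempPath);
}
```

This way the server never holds more than one 4096 KB chunk in memory per request, regardless of the total file size.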

Also, be aware that the maximum size of an array in .NET is limited by a 31-bit index. So the maximum size for a byte array is 2 GB, no matter how much RAM you have in the system.

Finally: if you're dealing with memory blocks this big, make sure that you are running in a 64-bit process, and at least on .NET 4.5, so you can take advantage of the Large Object Heap improvements in .NET 4.5. But even that isn't magic, as "Out Of Memory" does not refer to physical memory.
