
Out of memory using BinaryWriter in C# on adding files to a zip file

I am trying to add files to a Zip file, preserving the directory structure. The code below basically works as long as I do not have files of a few hundred MB to zip. If I just zip a directory with one file of about 250 MB (on a system with plenty of memory, BTW), I get an OutOfMemory exception on the writer.Write() line.

I already modified the code to read in chunks, as it first failed when I read/wrote the whole file in one go. I don't know why it still fails.

    using (FileStream zipToOpen = new FileStream(cZipName, eFileMode))
    using (ZipArchive archive = new ZipArchive(zipToOpen, ZipArchiveMode.Update)) // assumed: "archive" wraps the zip stream for update
    {
        ZipArchiveEntry readmeEntry = archive.CreateEntry(cFileToBackup);

        using (BinaryWriter writer = new BinaryWriter(readmeEntry.Open()))
        {
            // Load file into FileStream
            FileStream fsData = new FileStream(cFileFull, FileMode.Open, FileAccess.Read);

            byte[] buffer = new byte[1024];
            int bytesRead = 0;
            while ((bytesRead = fsData.Read(buffer, 0, buffer.Length)) > 0)
            {
                writer.Write(buffer, 0, bytesRead); // here it fails
                fsData.Flush(); // ->CHANGED THIS TO writer.Flush() SOLVED IT - nearly..
            }

            fsData.Close();
        }
    }

EDIT: Arkadiusz K was right that I used the flush on the reader, not the writer. After changing that, the program zips files of 1 GB or more, where it previously stopped at around 100 MB. However, I get another exception when I try to zip e.g. a 6 GB file; it stops with:

    System.IO.IOException was unhandled
      Message=Stream was too long
      Source=mscorlib
      StackTrace:
        at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
        (etc)

Does anybody have an idea why it still fails? I'd say the code should now properly read and write 1 KB at a time?

First of all, I'd really like to format your code and make it as succinct as it should be:

    var readmeEntry = archive.CreateEntry(cFileToBackup);
    using (var fsData = new FileStream(cFileFull, FileMode.Open, FileAccess.Read))
    using (var writer = new BinaryWriter(readmeEntry.Open()))
    {
        var buffer = new byte[1024];
        int bytesRead;
        while ((bytesRead = fsData.Read(buffer, 0, buffer.Length)) > 0)
        {
            writer.Write(buffer, 0, bytesRead); // here it fails
            writer.Flush();
        }
    }

Now, to explain why it fails:

BinaryWriter is a stream writer. When it has to write data to the stream, it usually writes it length-prefixed:

Length-prefixed means that this method first writes the length of the string, in bytes, when encoded with the BinaryWriter instance's current encoding, to the stream. This value is written as an unsigned integer. This method then writes that many bytes to the stream.
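
For reference, here is a minimal sketch (not from the original posts) of what that length prefix looks like; note that the byte[] overload used in the question writes the raw bytes only, without a prefix:

    using System;
    using System.IO;
    using System.Text;

    class LengthPrefixDemo
    {
        static void Main()
        {
            using (var ms = new MemoryStream())
            using (var writer = new BinaryWriter(ms, Encoding.UTF8))
            {
                writer.Write("hello");                      // 1 length byte + 5 UTF-8 bytes
                writer.Write(new byte[] { 1, 2, 3 }, 0, 3); // 3 raw bytes, no length prefix
                writer.Flush();
                Console.WriteLine(ms.Length);               // 9
            }
        }
    }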

In your case, in order to write to the file, the data is written to a MemoryStream first. Here, the MemoryStream is the backing store stream. Refer to the diagram below:

[Image: Streams in .NET]

(Image taken from: http://kcshadow.net/wpdeveloper/sites/default/files/streamd3.png)
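
As a minimal sketch of that layering (with a plain MemoryStream standing in for the backing store behind readmeEntry.Open(), purely for illustration):

    using System;
    using System.IO;

    class BackingStoreDemo
    {
        static void Main()
        {
            // Stream writer on top, backing store stream underneath: every byte
            // written through the BinaryWriter accumulates in the in-memory buffer.
            using (var backingStore = new MemoryStream())
            using (var writer = new BinaryWriter(backingStore))
            {
                writer.Write(new byte[1024], 0, 1024);
                writer.Flush();
                Console.WriteLine(backingStore.Length); // 1024 - the data now sits in memory
            }
        }
    }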

Because your system's memory is about 6-8 GB, or because your application has been allocated only that much memory, the backing store stream is expanded as far as possible when you attempt to zip a 6 GB file, and from that point on it throws the exception.
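
For context, a MemoryStream is backed by a single Int32-indexed byte array, so it can never hold more than int.MaxValue (~2 GB) bytes regardless of available RAM. A small sketch of that hard limit (it fails immediately, without allocating anything large):

    using System;
    using System.IO;

    class MemoryStreamCeiling
    {
        static void Main()
        {
            using (var ms = new MemoryStream())
            {
                try
                {
                    // Ask for ~6 GB, like the file in the question - well past int.MaxValue.
                    ms.SetLength(6L * 1024 * 1024 * 1024);
                }
                catch (ArgumentOutOfRangeException)
                {
                    Console.WriteLine("A MemoryStream cannot grow past Int32.MaxValue bytes (~2 GB).");
                }
            }
        }
    }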

Regarding your EDIT: I ran into the same issue. After some digging, I discovered that zipFileEntry.Open() returns a WrappedStream, which is the underlying stream (the one that cannot be flushed until you have finished writing to it).

This WrappedStream is the problem: its max length is ~2 GB. I couldn't find a way to get around this, so I ended up using a different compression library altogether.
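
For reference, the workaround usually suggested for this limit while staying inside System.IO.Compression is to open the archive with ZipArchiveMode.Create, which writes entry data through to the underlying file instead of holding it in a memory buffer. A hedged sketch (the method name and parameters are illustrative; note that Create mode cannot append to an existing archive):

    using System.IO;
    using System.IO.Compression;

    static class LargeFileZip
    {
        // Parameter names mirror the variables used in the question.
        public static void AddLargeFile(string cZipName, string cFileToBackup, string cFileFull)
        {
            using (var zipToOpen = new FileStream(cZipName, FileMode.Create))
            using (var archive = new ZipArchive(zipToOpen, ZipArchiveMode.Create))
            {
                ZipArchiveEntry entry = archive.CreateEntry(cFileToBackup);
                using (Stream entryStream = entry.Open())
                using (var fsData = new FileStream(cFileFull, FileMode.Open, FileAccess.Read))
                {
                    fsData.CopyTo(entryStream); // copies in chunks; nothing is buffered in a 2 GB-limited stream
                }
            }
        }
    }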
