
ushort array compression in C#

I have a ushort array (actually an image from a camera) that I want to compress losslessly before persisting it to storage. I am using the GZipStream function provided in System.IO.Compression.GZipStream. As far as I can tell, this approach requires me to convert the ushort array to a byte array. My solution appears to work correctly, but it is not as fast as I would like. The images are roughly 2 MB in raw size, and on my (slow) machine compression takes 200-400 ms and decompression 100-200 ms. Looking for suggestions on how to improve my performance.

public static class Zip
    {

        public static ushort[] Decompress_ByteToShort(byte[] zippedData)
        {
            byte[] decompressedData = null;            

            using (MemoryStream outputStream = new MemoryStream())
            {
                using (MemoryStream inputStream = new MemoryStream(zippedData))
                {
                    using (GZipStream zip = new GZipStream(inputStream, CompressionMode.Decompress))
                    {
                        zip.CopyTo(outputStream);
                    }
                }
                decompressedData = outputStream.ToArray();
            }

            ushort[] decompressShort = new ushort[decompressedData.Length / sizeof(ushort)];
            Buffer.BlockCopy(decompressedData, 0, decompressShort, 0, decompressedData.Length);

            return decompressShort;
        }



        public static byte[] Compress_ShortToByte(ushort[] plainData)
        {
            byte[] compressedData = null;

            byte[] uncompressedData = new byte[plainData.Length * sizeof(ushort)];
            Buffer.BlockCopy(plainData, 0, uncompressedData, 0, plainData.Length * sizeof(ushort));

            using (MemoryStream outputStream = new MemoryStream())
            {
                using (GZipStream zip = new GZipStream(outputStream, CompressionMode.Compress))
                {
                    zip.Write(uncompressedData, 0, uncompressedData.Length);
                }
                //Don't get the MemoryStream data before the GZipStream is closed,
                //since it doesn't yet contain the complete compressed data.
                //GZipStream writes additional data, including footer information, when it is disposed.
                compressedData = outputStream.ToArray();
            }

            return compressedData;
        }

    }
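For reference, a minimal sketch of how this class might be exercised and timed. The 1024x1024 gradient test image and the Stopwatch-based timing are illustrative assumptions, not the original benchmark:

using System;
using System.Diagnostics;

public static class ZipTimingDemo
{
    public static void Main()
    {
        // Hypothetical 1024x1024 16-bit test image (~2 MB), filled with a simple gradient.
        var image = new ushort[1024 * 1024];
        for (int i = 0; i < image.Length; i++)
            image[i] = (ushort)(i & 0x0FFF);

        var sw = Stopwatch.StartNew();
        byte[] compressed = Zip.Compress_ShortToByte(image);
        sw.Stop();
        Console.WriteLine("Compress:   {0} ms -> {1} bytes", sw.ElapsedMilliseconds, compressed.Length);

        sw.Restart();
        ushort[] restored = Zip.Decompress_ByteToShort(compressed);
        sw.Stop();
        Console.WriteLine("Decompress: {0} ms", sw.ElapsedMilliseconds);

        // Verify the round trip is lossless.
        bool identical = restored.Length == image.Length;
        for (int i = 0; identical && i < image.Length; i++)
            identical = restored[i] == image[i];
        Console.WriteLine("Lossless:   {0}", identical);
    }
}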

The first problem I see with your approach is that you are working with byte arrays instead of loading from and writing to the file directly. Using a smaller temporary buffer and reading/writing the stream and the file directly in chunks should be faster.

Here I propose some functions and overloads you can use to compress and decompress from a byte array to a byte array, from and to a stream, and from and to a file.

The performance improvement should be between 10% and 20%. Try tuning the constants as needed. I used DeflateStream instead of GZipStream, which improved performance; you can switch back to GZipStream if you prefer.

I only tried the byte[]-to-ushort and ushort-to-byte[] version of the code, and it is about 10% faster. Accessing the files directly, instead of loading them into a large buffer, should improve performance further.

WARNING: reading and writing the image this way is not Little-Endian/Big-Endian agnostic, meaning that a file saved from an Intel/AMD machine is not compatible with an ARM machine, for example in some tablets! Just saying :) (A sketch of one way to normalize the byte order follows the code below.)

    /// <summary>The average file size, used to preallocate the right amount of memory for compression.</summary>
    private const int AverageFileSize = 100000;

    /// <summary>The default size of the buffer used to convert data. WARNING: Must be a multiple of 2!</summary>
    private const int BufferSize = 32768;


    /// <summary>Decompresses a byte array to unsigned shorts.</summary>
    public static ushort[] Decompress_ByteToShort(byte[] zippedData)
    {
        using (var inputStream = new MemoryStream(zippedData))
            return Decompress_File(inputStream);
    }

    /// <summary>Decompresses a file to unsigned shorts.</summary>
    public static ushort[] Decompress_File(string inputFilePath)
    {
        using (var stream = new FileStream(inputFilePath, FileMode.Open, FileAccess.Read))
            return Decompress_File(stream);
    }

    /// <summary>Decompresses a file stream to unsigned shorts.</summary>
    public static ushort[] Decompress_File(Stream zippedData)
    {
        using (var zip = new DeflateStream(zippedData, CompressionMode.Decompress, true))
        {
            // Our temporary buffer.
            var buffer = new byte[BufferSize];

            // Read the number of ushorts, written as a header at the start of the stream.
            zip.Read(buffer, 0, sizeof(int));
            var resultLength = BitConverter.ToInt32(buffer, 0);

            // Creates the result array
            var result = new ushort[resultLength];

            // Decompress the file chunk by chunk
            var resultOffset = 0;
            for (; ; )
            {
                // Read a chunk of data
                var count = zip.Read(buffer, 0, BufferSize);
                if (count <= 0)
                    break;

                // Copy a piece of the decompressed buffer
                Buffer.BlockCopy(buffer, 0, result, resultOffset, count);

                // Advance counter
                resultOffset += count;
            }

            return result;
        }
    }

    /// <summary>Compresses a ushort array to a byte array.</summary>
    public static byte[] Compress_ShortToByte(ushort[] plainData)
    {
        using (var outputStream = new MemoryStream(AverageFileSize))
        {
            Compress_File(plainData, outputStream);
            return outputStream.ToArray();
        }
    }

    /// <summary>Compresses a ushort array directly to a file.</summary>
    public static void Compress_File(ushort[] plainData, string outputFilePath)
    {
        using (var stream = new FileStream(outputFilePath, FileMode.OpenOrCreate, FileAccess.Write))
            Compress_File(plainData, stream);
    }

    /// <summary>Compresses a ushort array directly to a file stream.</summary>
    public static void Compress_File(ushort[] plainData, Stream outputStream)
    {
        using (var zip = new DeflateStream(outputStream, CompressionMode.Compress, true))
        {
            // Our temporary buffer.
            var buffer = new byte[BufferSize];

            // Writes the length of the plain data
            zip.Write(BitConverter.GetBytes(plainData.Length), 0, sizeof(int));

            var inputOffset = 0;
            var availableBytes = plainData.Length * sizeof(ushort);
            while (availableBytes > 0)
            {
                // Compute the amount of bytes to copy.
                var bytesCount = Math.Min(BufferSize, availableBytes);

                // Copy a chunk of plain data into the temporary buffer
                Buffer.BlockCopy(plainData, inputOffset, buffer, 0, bytesCount);

                // Write the buffer
                zip.Write(buffer, 0, bytesCount);

                // Advance counters
                inputOffset += bytesCount;
                availableBytes -= bytesCount;
            }
        }
    }
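Regarding the endianness warning above: a minimal sketch, assuming you standardize on little-endian byte order in the stored file. Both helpers (SwapBytes, NormalizeToLittleEndian) are hypothetical additions and not part of the answer's code:

    /// <summary>Hypothetical helper: reverses the byte order of every ushort in place.</summary>
    private static void SwapBytes(ushort[] data)
    {
        for (int i = 0; i < data.Length; i++)
            data[i] = (ushort)((data[i] >> 8) | (data[i] << 8));
    }

    /// <summary>Normalizes the array to little-endian byte order; call before compressing and again after decompressing.</summary>
    private static void NormalizeToLittleEndian(ushort[] data)
    {
        // On little-endian machines (Intel/AMD and most ARM configurations) this is a no-op;
        // on big-endian machines the bytes are swapped so the on-disk layout is always the same.
        if (!BitConverter.IsLittleEndian)
            SwapBytes(data);
    }

Calling NormalizeToLittleEndian on the array just before Compress_File and just after Decompress_File keeps the stored bytes portable, and it costs nothing on the common little-endian platforms.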

There is a GZipStream(Stream, CompressionLevel) constructor; you can change the CompressionLevel to speed up compression. The enum includes a level named Fastest.

Links to the relevant documentation:
http://msdn.microsoft.com/pl-pl/library/hh137341(v=vs.110).aspx
http://msdn.microsoft.com/pl-pl/library/system.io.compression.compressionlevel(v=vs.110).aspx
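As a brief illustration, assuming .NET 4.5 or later (where this overload exists), the question's compression method could be adapted as below. The method name Compress_ShortToByte_Fastest is just an illustrative variant; the actual speedup would need to be measured:

    public static byte[] Compress_ShortToByte_Fastest(ushort[] plainData)
    {
        byte[] uncompressedData = new byte[plainData.Length * sizeof(ushort)];
        Buffer.BlockCopy(plainData, 0, uncompressedData, 0, uncompressedData.Length);

        using (MemoryStream outputStream = new MemoryStream())
        {
            // CompressionLevel.Fastest trades compression ratio for speed;
            // CompressionLevel.Optimal is the slower, smaller-output alternative.
            using (GZipStream zip = new GZipStream(outputStream, CompressionLevel.Fastest))
            {
                zip.Write(uncompressedData, 0, uncompressedData.Length);
            }
            // ToArray is still valid after the MemoryStream has been closed by the GZipStream.
            return outputStream.ToArray();
        }
    }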
