
ushort array compression in C#

I've got a ushort array (actually an image coming from a camera) that I'd like to losslessly compress before persistent storage. I'm using the GZipStream class from System.IO.Compression. As far as I know, this approach requires converting the ushort array to a byte array. My solution works correctly, but just isn't as quick as I'd like. The images are about 2 MB in raw size; on my (slow) machine compression takes 200-400 ms and decompression 100-200 ms. I'm looking for suggestions to improve performance.

public static class Zip
    {

        public static ushort[] Decompress_ByteToShort(byte[] zippedData)
        {
            byte[] decompressedData = null;            

            using (MemoryStream outputStream = new MemoryStream())
            {
                using (MemoryStream inputStream = new MemoryStream(zippedData))
                {
                    using (GZipStream zip = new GZipStream(inputStream, CompressionMode.Decompress))
                    {
                        zip.CopyTo(outputStream);
                    }
                }
                decompressedData = outputStream.ToArray();
            }

            ushort[] decompressShort = new ushort[decompressedData.Length / sizeof(ushort)];
            Buffer.BlockCopy(decompressedData, 0, decompressShort, 0, decompressedData.Length);

            return decompressShort;
        }



        public static byte[] Compress_ShortToByte(ushort[] plainData)
        {
            byte[] compressedData = null;

            byte[] uncompressedData = new byte[plainData.Length * sizeof(ushort)];
            Buffer.BlockCopy(plainData, 0, uncompressedData, 0, plainData.Length * sizeof(ushort));

            using (MemoryStream outputStream = new MemoryStream())
            {
                using (GZipStream zip = new GZipStream(outputStream, CompressionMode.Compress))
                {
                    zip.Write(uncompressedData, 0, uncompressedData.Length);
                }
                // Don't read the MemoryStream before the GZipStream is closed:
                // it doesn't yet contain the complete compressed data, because
                // GZipStream writes additional data (including the footer) on dispose.
                compressedData = outputStream.ToArray();
            }

            return compressedData;
        }

    }

The first problem I see in your approach is that you are working on whole byte arrays instead of reading from and writing to files directly. Using a smaller temporary buffer and reading/writing to streams and files in chunks should be much faster.

Here I propose some functions and overloads you can use to compress and decompress: from byte array to byte array, from stream to stream, and from file to file.

The performance improvement should be in the range of 10% to 20%. Adjust the constants as needed. I used DeflateStream instead of GZipStream, which improves performance a bit; you can switch back to GZipStream if you prefer.

I tried just the byte[]-to-ushort[] and ushort[]-to-byte[] versions of the code, and they are about 10% faster. Accessing files directly instead of loading everything into one big buffer should improve performance even more.

WARNING: This way of reading and writing images is not endianness-agnostic: a file saved on a little-endian Intel/AMD machine will not be read correctly on a big-endian machine (some ARM devices, for example). Just a side note :)
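If cross-platform portability ever matters, one option is to normalise the data to a fixed byte order before compressing and after decompressing. A minimal sketch, assuming little-endian as the on-disk convention (`BinaryPrimitives` is available from .NET Core 2.1 onward; the helper name is mine, not part of the code below):

```csharp
using System;
using System.Buffers.Binary;

static class EndianHelper
{
    // Normalise a ushort buffer to little-endian byte order in place.
    // On little-endian hardware (Intel/AMD, most ARM configurations)
    // this is a no-op; on a big-endian machine every element is byte-swapped.
    public static void ToLittleEndianInPlace(ushort[] data)
    {
        if (BitConverter.IsLittleEndian)
            return;
        for (int i = 0; i < data.Length; i++)
            data[i] = BinaryPrimitives.ReverseEndianness(data[i]);
    }
}
```

Call it on the array before compressing and again on the result after decompressing, so the file format is the same regardless of which machine wrote it.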

    /// <summary>The average file size, used to preallocate the right amount of memory for compression.</summary>
    private const int AverageFileSize = 100000;

    /// <summary>The default size of the buffer used to convert data. WARNING: Must be a multiple of 2!</summary>
    private const int BufferSize = 32768;


    /// <summary>Decompresses a byte array to unsigned shorts.</summary>
    public static ushort[] Decompress_ByteToShort(byte[] zippedData)
    {
        using (var inputStream = new MemoryStream(zippedData))
            return Decompress_File(inputStream);
    }

    /// <summary>Decompresses a file to unsigned shorts.</summary>
    public static ushort[] Decompress_File(string inputFilePath)
    {
        using (var stream = new FileStream(inputFilePath, FileMode.Open, FileAccess.Read))
            return Decompress_File(stream);
    }

    /// <summary>Decompresses a file stream to unsigned shorts.</summary>
    public static ushort[] Decompress_File(Stream zippedData)
    {
        using (var zip = new DeflateStream(zippedData, CompressionMode.Decompress, true))
        {
            // Our temporary buffer.
            var buffer = new byte[BufferSize];

            // Read the number of elements, written as a header at the start of the stream.
            // Read may return fewer bytes than requested, so loop until the header is complete.
            var headerRead = 0;
            while (headerRead < sizeof(int))
                headerRead += zip.Read(buffer, headerRead, sizeof(int) - headerRead);
            var resultLength = BitConverter.ToInt32(buffer, 0);

            // Creates the result array
            var result = new ushort[resultLength];

            // Decompress the file chunk by chunk
            var resultOffset = 0;
            for (; ; )
            {
                // Read a chunk of data
                var count = zip.Read(buffer, 0, BufferSize);
                if (count <= 0)
                    break;

                // Copy a piece of the decompressed buffer
                Buffer.BlockCopy(buffer, 0, result, resultOffset, count);

                // Advance counter
                resultOffset += count;
            }

            return result;
        }
    }

    /// <summary>Compresses a ushort array to a byte array.</summary>
    public static byte[] Compress_ShortToByte(ushort[] plainData)
    {
        using (var outputStream = new MemoryStream(AverageFileSize))
        {
            Compress_File(plainData, outputStream);
            return outputStream.ToArray();
        }
    }

    /// <summary>Compresses a ushort array directly to a file.</summary>
    public static void Compress_File(ushort[] plainData, string outputFilePath)
    {
        using (var stream = new FileStream(outputFilePath, FileMode.OpenOrCreate, FileAccess.Write))
            Compress_File(plainData, stream);
    }

    /// <summary>Compresses a ushort array directly to a stream.</summary>
    public static void Compress_File(ushort[] plainData, Stream outputStream)
    {
        using (var zip = new DeflateStream(outputStream, CompressionMode.Compress, true))
        {
            // Our temporary buffer.
            var buffer = new byte[BufferSize];

            // Writes the length of the plain data
            zip.Write(BitConverter.GetBytes(plainData.Length), 0, sizeof(int));

            var inputOffset = 0;
            var availableBytes = plainData.Length * sizeof(ushort);
            while (availableBytes > 0)
            {
                // Compute the amount of bytes to copy.
                var bytesCount = Math.Min(BufferSize, availableBytes);

                // Copy a chunk of plain data into the temporary buffer
                Buffer.BlockCopy(plainData, inputOffset, buffer, 0, bytesCount);

                // Write the buffer
                zip.Write(buffer, 0, bytesCount);

                // Advance counters
                inputOffset += bytesCount;
                availableBytes -= bytesCount;
            }
        }
    }
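A quick way to sanity-check this chunked scheme is a self-contained round trip. The sketch below does not reuse the functions above, but follows the same convention (a 4-byte element-count header followed by the raw ushort bytes), with the reads looped because `Read` may return fewer bytes than requested:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Hypothetical test data standing in for a camera frame.
ushort[] original = new ushort[1024];
for (int i = 0; i < original.Length; i++) original[i] = (ushort)(i * 7);

// Compress: 4-byte length header, then the raw bytes.
byte[] compressed;
using (var ms = new MemoryStream())
{
    using (var zip = new DeflateStream(ms, CompressionMode.Compress, true))
    {
        zip.Write(BitConverter.GetBytes(original.Length), 0, sizeof(int));
        var bytes = new byte[original.Length * sizeof(ushort)];
        Buffer.BlockCopy(original, 0, bytes, 0, bytes.Length);
        zip.Write(bytes, 0, bytes.Length);
    }
    compressed = ms.ToArray();
}

// Decompress: read the header, then the payload, looping on Read.
ushort[] restored;
using (var ms = new MemoryStream(compressed))
using (var zip = new DeflateStream(ms, CompressionMode.Decompress))
{
    var header = new byte[sizeof(int)];
    int read = 0;
    while (read < header.Length)
        read += zip.Read(header, read, header.Length - read);
    int length = BitConverter.ToInt32(header, 0);

    var bytes = new byte[length * sizeof(ushort)];
    read = 0;
    while (read < bytes.Length)
    {
        int n = zip.Read(bytes, read, bytes.Length - read);
        if (n <= 0) break;
        read += n;
    }
    restored = new ushort[length];
    Buffer.BlockCopy(bytes, 0, restored, 0, bytes.Length);
}
// restored now equals original element-for-element.
```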

There is a GZipStream constructor overload that takes (Stream, CompressionLevel). You can pass CompressionLevel.Fastest from that enumeration to speed up compression.
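For example (a minimal sketch; CompressionLevel.Fastest trades compression ratio for speed and is available since .NET Framework 4.5):

```csharp
using System.IO;
using System.IO.Compression;

static byte[] CompressFast(byte[] data)
{
    using (var output = new MemoryStream())
    {
        // CompressionLevel.Fastest favours speed over compression ratio.
        using (var zip = new GZipStream(output, CompressionLevel.Fastest))
            zip.Write(data, 0, data.Length);
        // Read the MemoryStream only after the GZipStream is disposed,
        // so the footer has been flushed.
        return output.ToArray();
    }
}
```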

Links to relevant documentation:
http://msdn.microsoft.com/pl-pl/library/hh137341(v=vs.110).aspx
http://msdn.microsoft.com/pl-pl/library/system.io.compression.compressionlevel(v=vs.110).aspx
