
Specified initialization vector (IV) does not match the block size in C# using AES

I'm starting with a provided example in Ruby:

cipher = OpenSSL::Cipher::AES.new(128, :CBC)
cipher.encrypt
cipher.key = "wB\x14=\r\xC3\xC1\x84$\x10\xCE\xC0\x10\x03\xFE\x18"
cipher.iv = "\xD8a\"\xFAs\xBD\xE4\xF9\xA4\xA1\x1E\xA5l\xA6@\xFD"
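(As a quick sanity check, not part of the original example: both escaped strings decode to exactly 16 bytes, matching AES-128's key and block size.)

```ruby
key = "wB\x14=\r\xC3\xC1\x84$\x10\xCE\xC0\x10\x03\xFE\x18"
iv  = "\xD8a\"\xFAs\xBD\xE4\xF9\xA4\xA1\x1E\xA5l\xA6@\xFD"

# Each \xNN escape is a single byte, so both strings are 16 bytes long.
puts key.bytesize  # 16
puts iv.bytesize   # 16
```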

And I'm trying to replicate it in C#:

string AesKey = "wB\x14=\r\xC3\xC1\x84$\x10\xCE\xC0\x10\x03\xFE\x18";
string AesIV = "\xD8a\"\xFAs\xBD\xE4\xF9\xA4\xA1\x1E\xA5l\xA6@\xFD";

AesCryptoServiceProvider aes = new AesCryptoServiceProvider();
aes.BlockSize = 128;
aes.KeySize = 128;
aes.IV = Encoding.UTF8.GetBytes(AesIV);
aes.Key = Encoding.UTF8.GetBytes(AesKey);
aes.Mode = CipherMode.CBC;

byte[] src = Encoding.Unicode.GetBytes(text);
using (ICryptoTransform encrypt = aes.CreateEncryptor())
  {
      byte[] dest = encrypt.TransformFinalBlock(src, 0, src.Length);
      string EncryptedResult = Convert.ToBase64String(dest);
      EncryptedValue.Text = EncryptedResult;
  }

I'm getting the error:

"Specified initialization vector (IV) does not match the block size for this algorithm."

Am I misunderstanding something about the format of the original key and iv values that I am failing to account for?

Ruby and C# strings are different, so your C# IV winds up with extra bytes in it.

In Ruby, a string is really just a byte sequence. Each \x escape is followed by exactly two hexadecimal digits. So, \xD8a is really a textual representation of the bytes 0xD8 0x61. No character encoding is necessary when this "string" is used as an initialization vector; it's already a byte sequence masquerading as text.

In C#, a string is a sequence of characters. To convert a string to a byte array, a character encoding is used; you've chosen UTF-8 in this case. Because C# strings must be able to represent millions of different characters, each \x escape is followed by 1 to 4 hexadecimal digits. So, for example, the substring \xD8a doesn't represent two bytes but the single character U+0D8A, and when you encode it with UTF-8, it translates to the 3-byte sequence 0xE0 0xB6 0x8A instead of the two-byte 0xD8 0x61 Ruby equivalent.
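You can observe both behaviors from Ruby (a small illustrative check; the C# parse of \xD8a is simulated with Ruby's equivalent \u escape for U+0D8A):

```ruby
# C# parses "\xD8a" as the single character U+0D8A;
# UTF-8 encodes that character as three bytes: 0xE0 0xB6 0x8A.
p "\u0D8A".bytes  # [224, 182, 138]

# Ruby parses "\xD8a" as the two raw bytes 0xD8 0x61.
p "\xD8a".bytes   # [216, 97]
```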

You can base64-encode the IV if you need a printable form. Or you can write it in C# as a byte-array literal:

aes.IV = new byte[] { 0xD8, 0x61, ... };
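If typing the literal out by hand is error-prone, one option (a sketch, assuming you have Ruby available) is to generate the C# byte-array literal from the original Ruby string:

```ruby
iv = "\xD8a\"\xFAs\xBD\xE4\xF9\xA4\xA1\x1E\xA5l\xA6@\xFD"

# Format each byte as 0xNN and join into a C# array literal.
hex = iv.bytes.map { |b| format('0x%02X', b) }.join(', ')
puts "new byte[] { #{hex} };"
```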

However, the point is probably moot, since a real application requires security, and hard-coded IVs are not secure.

Make sure you are using the same padding scheme, such as PKCS5/PKCS7 padding or plain zero padding. I had an issue with this a few weeks ago across PHP/Java/C#, and it was the padding that was causing the problem.

The data is encrypted in fixed-size blocks, and the last block typically won't fill its block exactly, so the cipher has to pad the end of it out to the full block size. In C# you can set the padding mode directly; in Ruby you may have to write the code to append the padding yourself.

Edit: It looks like Ruby uses PKCS5-style padding by default, so just set the matching padding in C# and see what happens. Note that .NET's PaddingMode enum has no PKCS5 value; use PaddingMode.PKCS7, which is identical to PKCS5 for AES's 16-byte blocks:

aes.Padding = PaddingMode.PKCS7;
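For reference, PKCS#5/#7 padding fills the final block with N copies of the byte value N, where N is the number of bytes needed to reach the block boundary. A minimal Ruby sketch (the helper name is mine, not from either library):

```ruby
# Hypothetical helper illustrating PKCS#7 padding: append n copies of the
# byte n, where n is the distance to the next 16-byte boundary.
def pkcs7_pad(data, block_size = 16)
  n = block_size - (data.bytesize % block_size)
  data + (n.chr * n)
end

padded = pkcs7_pad("hello")
puts padded.bytesize    # 16
puts padded.bytes.last  # 11
```

Note that when the input is already a multiple of the block size, a whole extra block of padding is appended, which is what lets the decryptor strip it unambiguously.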

Also try base64-encoding the data before you hand it to the algorithm, and then base64-decoding the data you get back from the decrypt method. I had to do this as well; otherwise the decrypted data would sometimes be corrupted.
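A quick illustration of that round trip (a sketch in Ruby; Base64 on the way in, decode after decryption):

```ruby
require 'base64'

payload = Base64.strict_encode64("some text")  # printable ASCII for the cipher
# ... encrypt payload, transmit, decrypt it back to payload ...
restored = Base64.strict_decode64(payload)
p restored  # "some text"
```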

I would also recommend using a 256-bit key, since it is more secure and just as easy to use as a 128-bit key.
