AES Encryption between .NET WinRT and iOS Obj-C isn't the same
I'm having trouble getting my iOS AES encryption/decryption to match what I get on WinRT.
I can't change the implementation on the WinRT side because it's already used in a published app.
Here are two samples I made with a zero key and zero IV; the outputs are different.
C# sample code:
using (MemoryStream saveDataMemoryStreamCrypto = new MemoryStream())
{
    var saveDataKeyProvider = Windows.Security.Cryptography.Core.SymmetricKeyAlgorithmProvider.OpenAlgorithm(Windows.Security.Cryptography.Core.SymmetricAlgorithmNames.AesCbcPkcs7);
    var saveDataKeyBuffer = Windows.Security.Cryptography.CryptographicBuffer.CreateFromByteArray(new byte[32]);
    var saveDataKey = saveDataKeyProvider.CreateSymmetricKey(saveDataKeyBuffer);
    var saveDataSaltBuffer = Windows.Security.Cryptography.CryptographicBuffer.CreateFromByteArray(new byte[32]);
    var saveDataDataBuffer = Windows.Security.Cryptography.CryptographicBuffer.ConvertStringToBinary("ABCDEFGH", Windows.Security.Cryptography.BinaryStringEncoding.Utf16BE);
    var saveDataOutBuffer = Windows.Security.Cryptography.Core.CryptographicEngine.Encrypt(saveDataKey, saveDataDataBuffer, saveDataSaltBuffer);
    var saveDataOutBytes = saveDataOutBuffer.ToArray();
}
C# byte output:
80 87 109 195 133 40 205 81 117 91 17 132 229 3 119 251 205 8 246 64 13 57 210 142 11 153 121 39 122 196 63 10
Obj-C sample code:
Byte keyPtr[32];
bzero(keyPtr, sizeof(keyPtr));
Byte ivPtr[32];
bzero(ivPtr, sizeof(ivPtr));
NSString *text = @"ABCDEFGH";
NSUInteger dataLength;
void * buffer = malloc([text length]);
[text getBytes:buffer maxLength:[text length] usedLength:&dataLength encoding:NSUTF16BigEndianStringEncoding options:0 range:NSMakeRange(0, dataLength) remainingRange:nil];
size_t bufferSize = dataLength * kCCBlockSizeAES128;
void * bufferOut = malloc(bufferSize);
size_t numBytesEncrypted = 0;
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding, keyPtr, kCCKeySizeAES256, ivPtr, buffer, dataLength, bufferOut, bufferSize, &numBytesEncrypted);
Obj-C byte output:
23 144 186 234 149 182 123 79 155 234 250 54 52 38 151 87 179 62 176 1 203 115 59 1 35 54 176 1 44 213 120 1
Does anyone know what I'm doing wrong?
Thanks, Greg
Here is the working code. The main difference is that UTF-16 data is two bytes per character, so the data length must be doubled:
u_int8_t keyPtr[32];
bzero(keyPtr, sizeof(keyPtr));
u_int8_t ivPtr[32];
bzero(ivPtr, sizeof(ivPtr));
NSString *text = @"ABCDEFGH";
NSUInteger dataLength = [text length] * 2; // Allow for utf16
void * buffer = malloc(dataLength);
[text getBytes:buffer maxLength:dataLength usedLength:nil encoding:NSUTF16BigEndianStringEncoding options:0 range:NSMakeRange(0, dataLength) remainingRange:nil];
size_t bufferSize = dataLength * kCCBlockSizeAES128;
u_int8_t * bufferOut = malloc(bufferSize);
size_t numBytesEncrypted = 0;
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding, keyPtr, kCCKeySizeAES256, ivPtr, buffer, dataLength, bufferOut, bufferSize, &numBytesEncrypted);
printf("encoded text in decimal: ");
for (int i = 0; i < numBytesEncrypted; i++) {
    printf("%d ", bufferOut[i]);
}
printf("\n");
printf output:
encoded text in decimal: 80 87 109 195 133 40 205 81 117 91 17 132 229 3 119 251 205 8 246 64 13 57 210 142 11 153 121 39 122 196 63 10
Yes, this really is awful code; I made the minimum changes necessary. I guess we're entering a new era where data dumps are printed in decimal and hex is dead.