
Java encrypt / decrypt via String not working because of padding

I have to encrypt a string using AES/ECB/PKCS5Padding. The encrypted result (new String(encryptedResult), as they don't want bytes) is then sent to a partner. My partner then decrypts the string using getBytes().

Here is the decrypting method :

    public static String decrypter(final String donnees) throws NoSuchAlgorithmException, NoSuchPaddingException,
            InvalidKeyException, IllegalBlockSizeException, BadPaddingException {
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, key);

        return new String(cipher.doFinal(donnees.getBytes()));
    }

My problem is that I get this error when I try to decrypt: Input length must be multiple of 16 when decrypting with padded cipher.

When I decrypt the bytes directly it works just fine. How can I make string.getBytes() not lose the padding? Or is there any other solution?

I cannot change the encryption algorithm, and the same goes for sending a string rather than bytes to the partner.

A padding error generally means the decryption failed; causes include an incorrect key, corrupted data, or a bad encoding. Incorrect decryption has the side effect of also producing incorrect padding.

In this case it is an incorrect encoding of the encrypted data: ciphertext is arbitrary binary, and round-tripping it through new String(...) / getBytes() corrupts bytes that have no valid character mapping. If you need the encrypted data as a string, the usual approach is to Base64- or hex-encode it.

This code is incorrect: new String(cipher.doFinal(donnees.getBytes()));
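As a minimal sketch of the Base64 approach (the key here is generated locally for illustration; in the real setup it would be the key shared with the partner):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class AesBase64Demo {
    public static void main(String[] args) throws Exception {
        // Hypothetical key, only for this demo; real code uses the shared key.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();

        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");

        // Encrypt, then Base64-encode the raw ciphertext so it survives as a String.
        cipher.init(Cipher.ENCRYPT_MODE, key);
        byte[] encrypted = cipher.doFinal("message secret".getBytes(StandardCharsets.UTF_8));
        String transmis = Base64.getEncoder().encodeToString(encrypted);

        // Decrypt: Base64-decode first to recover the exact ciphertext bytes.
        cipher.init(Cipher.DECRYPT_MODE, key);
        byte[] decrypted = cipher.doFinal(Base64.getDecoder().decode(transmis));
        System.out.println(new String(decrypted, StandardCharsets.UTF_8));
    }
}
```

The Base64 string can be sent as-is and decoded back to the identical bytes, so the padded block length is preserved and doFinal no longer sees a truncated input.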
