
HMAC SHA256 JWT Signature is incorrect

The answer:

Instead of the hmac func below, here are the functions I am now using...

func base64Encoded(algorithm: CryptoAlgorithm, key: String) -> String {
  let hmac = self.hmac(algorithm: algorithm, key: key)
  let digestLen = algorithm.digestLength
  let dataResult = NSData(bytes: hmac, length: digestLen)
  hmac.deallocate(capacity: digestLen)

  return dataResult.base64EncodedString()
}

func hash(algorithm: CryptoAlgorithm, key: String) -> String {
  let hmac = self.hmac(algorithm: algorithm, key: key)
  let digestLen = algorithm.digestLength
  let hash = NSMutableString()

  for i in 0..<digestLen {
    hash.appendFormat("%02x", hmac[i])
  }

  hmac.deallocate(capacity: digestLen)

  return hash as String 
}

func hmac(algorithm: CryptoAlgorithm, key: String) -> UnsafeMutablePointer<CUnsignedChar> {
  let str = self.cString(using: String.Encoding.utf8)
  let strLen = Int(self.lengthOfBytes(using: String.Encoding.utf8))
  let digestLen = algorithm.digestLength

  let result = UnsafeMutablePointer<CUnsignedChar>.allocate(capacity: digestLen)

  let keyStr = key.cString(using: String.Encoding.utf8)
  let keyLen = Int(key.lengthOfBytes(using: String.Encoding.utf8))

  CCHmac(algorithm.HMACAlgorithm, keyStr!, keyLen, str!, strLen, result)

  return result
}
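
A caveat worth flagging: base64EncodedString() produces standard base64, while JWT signatures are base64url encoded (no = padding, and - and _ in place of + and /). Here is a minimal sketch of the conversion; the token variable is assumed to hold the full JWT, and the extension above is assumed to be in scope.

// Compute the HMAC over header.payload and convert it to base64url.
let parts = token.components(separatedBy: ".")
let signingInput = parts[0] + "." + parts[1]
let base64 = signingInput.base64Encoded(algorithm: .SHA256, key: "")
let base64url = base64
  .replacingOccurrences(of: "+", with: "-")
  .replacingOccurrences(of: "/", with: "_")
  .replacingOccurrences(of: "=", with: "")
// base64url should now match parts[2], the signature segment.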

ORIGINAL POST

I have a JWT whose signature I am trying to verify. Here is the JWT...

eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJhcGkudGVzdC5jb20vdjEvYXV0aCIsImV4cCI6MTQ2OTk3ODQ5OCwic3ViIjoiMTIzNDU2Nzg5MCIsImVtYWlsIjoidGVzdEB0ZXN0LmNvbSIsInJvbGVzIjpbImFkbWluIiwiY3VzdG9tZXIiXSwicGVybWlzc2lvbnMiOlsidGVzdC5wcm9maWxlIiwidGVzdC5wcm9maWxlLmNvbnRhY3QiLCJ0ZXN0LnByb2ZpbGUuZGV2aWNlIiwidGVzdC5wcm9maWxlLmFwcCJdfQ.GfLxXOL978Pm5GYMI0WTBEVcMrfVj2jJb-Il_XzO7g4

I'm working in Swift 3, and I updated the methods from this SO answer: https://stackoverflow.com/a/24411522/741626. This is what those methods now look like.

import Foundation
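// Note: in Swift 3, CommonCrypto is not importable as a module, so it must be
// exposed through an Objective-C bridging header that contains
// #import <CommonCrypto/CommonCrypto.h>.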

enum CryptoAlgorithm {
  case MD5, SHA1, SHA224, SHA256, SHA384, SHA512

  var HMACAlgorithm: CCHmacAlgorithm {
    var result: Int = 0
    switch self {
    case .MD5:      result = kCCHmacAlgMD5
    case .SHA1:     result = kCCHmacAlgSHA1
    case .SHA224:   result = kCCHmacAlgSHA224
    case .SHA256:   result = kCCHmacAlgSHA256
    case .SHA384:   result = kCCHmacAlgSHA384
    case .SHA512:   result = kCCHmacAlgSHA512
    }
    return CCHmacAlgorithm(result)
  }

  var digestLength: Int {
    var result: Int32 = 0
    switch self {
    case .MD5:      result = CC_MD5_DIGEST_LENGTH
    case .SHA1:     result = CC_SHA1_DIGEST_LENGTH
    case .SHA224:   result = CC_SHA224_DIGEST_LENGTH
    case .SHA256:   result = CC_SHA256_DIGEST_LENGTH
    case .SHA384:   result = CC_SHA384_DIGEST_LENGTH
    case .SHA512:   result = CC_SHA512_DIGEST_LENGTH
    }
    return Int(result)
  }
}

extension String {

  func hmac(algorithm: CryptoAlgorithm, key: String) -> String {
    let str = self.cString(using: String.Encoding.utf8)
    let strLen = Int(self.lengthOfBytes(using: String.Encoding.utf8))
    let digestLen = algorithm.digestLength

    let result = UnsafeMutablePointer<CUnsignedChar>.allocate(capacity: digestLen)

    let keyStr = key.cString(using: String.Encoding.utf8)
    let keyLen = Int(key.lengthOfBytes(using: String.Encoding.utf8))

    CCHmac(algorithm.HMACAlgorithm, keyStr!, keyLen, str!, strLen, result)

    // Hex-encode the raw digest bytes.
    let hash = NSMutableString()
    for i in 0..<digestLen {
      hash.appendFormat("%02x", result[i])
    }

    result.deallocate(capacity: digestLen)

    return hash as String
  }
}

I am able to base64-decode the header and payload successfully, but when I try to verify the signature it is always wrong: it doesn't look anything like the expected signature, and it is far too long.

What I've tried:

1. I've tried several JWTs; the result is always wrong in the same way.

2. I've hard-coded the SHA-256 constants to ensure I wasn't using the wrong algorithm constant or digest length.

3. I've tried several values of String.Encoding; while each generates a different result, as expected, none of them generates the desired signature.

4. I've used an Objective-C method to generate the HMAC, to rule out that my conversion to Swift 3 broke anything. Same results. Here is the Objective-C code.

+ (NSData *)hmacSha256:(NSString *)string key:(NSString *)key
{
  NSData *dataIn = [string dataUsingEncoding:NSUTF8StringEncoding];
  NSData *keyIn = [key dataUsingEncoding:NSUTF8StringEncoding];
  NSMutableData *macOut = [NSMutableData dataWithLength:CC_SHA256_DIGEST_LENGTH];

  CCHmac( kCCHmacAlgSHA256,
       keyIn.bytes,
       keyIn.length,
       dataIn.bytes,
       dataIn.length,
       macOut.mutableBytes);

  return macOut;
}

The code originated here: https://stackoverflow.com/a/31003443/741626. I'd rather not plug Objective-C code into my Swift project, but if I have to, I will!

Here is my calling func:

func decodeToken(token: String) {
  let array = token.characters.split(separator: ".").map(String.init)

  let header = array[0]
  let payload = array[1]
  let signature = array[2]

  let encodedString = header + "." + payload
  let hmac = encodedString.hmac(algorithm: .SHA256, key: "")
}

EDIT

When I run the code, the resulting hmac is:

19f2f15ce2fdefc3e6e4660c23459304455c32b7d58f68c96fe225fd7cceee0e

I've triple-checked that the secret is correct; it is "" (the empty string).

What am I doing wrong?

I would really like to verify in my client app that the tokens I receive are trustworthy rather than gloss over this. If anyone has any ideas about what I'm doing wrong, that would be great.

ANSWER

Actually, the value you've generated is correct. You're just looking at the hexadecimal representation of the HMAC value, while the JWT signature is of course base64url encoded.

To verify the hash, it is important to compare the byte values instead of the encoded values. The encoded values are only required for human consumption (hexadecimal) or for transport over protocols that require text (base64url).
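
As a minimal sketch of getting at those byte values (the helper name base64urlDecode is illustrative, not from any library), the JWT's signature segment can be mapped back to standard base64 and then decoded:

// Decode a base64url string (as used in JWT segments) into raw bytes.
func base64urlDecode(_ input: String) -> Data? {
  var base64 = input
    .replacingOccurrences(of: "-", with: "+")
    .replacingOccurrences(of: "_", with: "/")
  // Restore the = padding that base64url strips.
  while base64.characters.count % 4 != 0 {
    base64 += "="
  }
  return Data(base64Encoded: base64)
}

Decoding GfLxXOL978Pm5GYMI0WTBEVcMrfVj2jJb-Il_XzO7g4 this way yields the same 32 bytes whose hex form is the 19f2f15c... value above, which is why the computed HMAC is in fact correct.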


When performing the byte array comparison, please make sure that it is time-constant, or you may introduce vulnerabilities.

One way is to XOR each pair of bytes and OR the results into a single accumulator byte, then test that byte against 00h (any other value indicates a mismatch). Reject incorrectly sized arrays first.
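
A minimal sketch of that scheme in Swift (the function name constantTimeEquals is illustrative):

// Time-constant comparison: XOR each byte pair and OR the results together.
// The loop always runs to completion, so timing does not reveal the position
// of the first mismatch; only identical arrays leave diff at zero (00h).
func constantTimeEquals(_ a: [UInt8], _ b: [UInt8]) -> Bool {
  // Reject incorrectly sized arrays first.
  guard a.count == b.count else { return false }
  var diff: UInt8 = 0
  for i in 0..<a.count {
    diff |= a[i] ^ b[i]
  }
  return diff == 0
}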
