HMAC SHA256 in C++ (DynamoDB)
I'm trying to connect to DynamoDB through the REST Web API, and it requires me to generate a signature using HMAC-SHA256. I've got SHA-256 working, but I can't seem to get HMAC working. Here is the C++ code (using OpenSSL):
string hmac(string key, string msg)
{
    unsigned char hash[32];

    HMAC_CTX hmac;
    HMAC_CTX_init(&hmac);
    HMAC_Init_ex(&hmac, &key[0], key.length(), EVP_sha256(), NULL);
    HMAC_Update(&hmac, (unsigned char*)&msg[0], msg.length());
    unsigned int len = 32;
    HMAC_Final(&hmac, hash, &len);
    HMAC_CTX_cleanup(&hmac);

    stringstream ss;
    for (unsigned int i = 0; i < len; i++)
    {
        ss << hex << (unsigned int)hash[i];
    }
    return ss.str();
}
Here is the call to hmac:
/********** CALCULATE SIGNATURE **********/
string AWS4 = "AWS4" + secretKey;
string Kdate = hmac(AWS4.data(), dateStamp);
string Kregion = hmac(Kdate.data(), region);
string Kservice = hmac(Kregion.data(), service);
string signingkey = hmac(Kservice.data(), "aws4_request");
string signature = hmac(signingkey.data(), stringToSign);
string authorizationHeader = algorithm + " Credential=" + accessKey + "/" + credential_scope + ", SignedHeaders=" + signedHeaders + ", Signature=" + signature;
This is the Python code I'm basing it on:
def sign(key, msg):
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

def getSignatureKey(key, date_stamp, regionName, serviceName):
    kDate = sign(('AWS4' + key).encode('utf-8'), date_stamp)
    kRegion = sign(kDate, regionName)
    kService = sign(kRegion, serviceName)
    kSigning = sign(kService, 'aws4_request')
    print 'Kdate: ' + kDate
    print 'Kregion: ' + kRegion
    print 'Kservice: ' + kService
    return kSigning
Given the same values they produce different results. Can anyone help me figure out why? Thanks.
The issue is that DynamoDB calculates the HMAC in two different ways. One implementation returns the raw digest bytes as a string, and the other returns a hex-encoded representation.
The hex implementation:
string hmacHex(string key, string msg)
{
    unsigned char hash[32];

    HMAC_CTX hmac;
    HMAC_CTX_init(&hmac);
    HMAC_Init_ex(&hmac, &key[0], key.length(), EVP_sha256(), NULL);
    HMAC_Update(&hmac, (unsigned char*)&msg[0], msg.length());
    unsigned int len = 32;
    HMAC_Final(&hmac, hash, &len);
    HMAC_CTX_cleanup(&hmac);

    std::stringstream ss;
    ss << std::hex << std::setfill('0');
    for (unsigned int i = 0; i < len; i++)
    {
        ss << std::setw(2) << (unsigned int)hash[i];
    }
    return ss.str();
}
The string implementation:
string hmac(string key, string msg)
{
    unsigned char hash[32];

    HMAC_CTX hmac;
    HMAC_CTX_init(&hmac);
    HMAC_Init_ex(&hmac, &key[0], key.length(), EVP_sha256(), NULL);
    HMAC_Update(&hmac, (unsigned char*)&msg[0], msg.length());
    unsigned int len = 32;
    HMAC_Final(&hmac, hash, &len);
    HMAC_CTX_cleanup(&hmac);

    std::stringstream ss;
    for (unsigned int i = 0; i < len; i++)
    {
        ss << hash[i];
    }
    return ss.str();
}
Amazon uses the string (raw-bytes) implementation to derive the date, region, service, and signing keys. The hex implementation is used only for the final signature.
Mike's answer has a bug. Don't use std::string's .length() to find the length of the key when dealing with binary data, because binary data can contain null characters before the true end of the data. Either take a char array and a length as parameters for both key and msg, or, if you are using C++11, use std::vector to store the binary data.
The following is a partial implementation of Mike's answer with vectors as parameters:
std::vector<uint8_t> HMAC_SHA256(const std::vector<uint8_t>& key,
                                 const std::vector<uint8_t>& value)
{
    unsigned int len = SHA256_DIGEST_LENGTH;
    unsigned char hash[SHA256_DIGEST_LENGTH];

    HMAC_CTX hmac;
    HMAC_CTX_init(&hmac);
    HMAC_Init_ex(&hmac, key.data(), key.size(), EVP_sha256(), NULL);
    HMAC_Update(&hmac, value.data(), value.size());
    HMAC_Final(&hmac, hash, &len);
    HMAC_CTX_cleanup(&hmac);

    return std::vector<uint8_t>(hash, hash + SHA256_DIGEST_LENGTH);
}
static int C_hmac()
{
    /* One-shot interface, see man 3 HMAC; no HMAC_CTX is needed here. */
    std::string data = "your data";
    std::string key = "your key";
    unsigned int hash_sz = EVP_MAX_MD_SIZE;

    unsigned char* digest = HMAC(EVP_sha256(), key.c_str(), key.size(),
                                 (unsigned char*)data.c_str(), data.size(),
                                 NULL, &hash_sz);

    std::stringstream ss;
    ss << std::hex << std::setfill('0');
    for (unsigned int i = 0; i < hash_sz; ++i)
    {
        ss << std::setw(2) << (unsigned int)digest[i];
    }
    std::string final_hash = ss.str();
    return 1;
}