Why isn't this base64 decode/encode function working properly?

I am using the base64 functions from here to try and run a simple program. I want to convert binary data to base64 using this code:

int x = 1;
std::string data = base64_encode(reinterpret_cast<unsigned char *>((void *)&x), 4);
std::string out = base64_decode(data);
int y = reinterpret_cast<int>(out.data());

The encode function is called and generates the string "AQAAAA==". From my understanding, this should convert to 01 00 00 0 if decoded and converted to bytes (I actually don't understand why it's 7 bytes). When the decode function is called, I expect 01 00 00 00, which would then be reinterpret_cast-able back into the integer 1, but instead I get "\001\000\000", which is not what I expect. Funnily, when I tried increasing the second parameter of the encode function, it gave me the proper answer of "\001\000\000\000" after decode.
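For reference, the decoded length can be read directly off the encoded string: every 4 base64 characters carry 3 bytes, and each trailing '=' removes one byte, so "AQAAAA==" (8 characters, 2 padding) decodes to (8/4)*3 - 2 = 4 bytes. A minimal sketch of that arithmetic (the helper name decoded_length is my own, not from the linked code):

#include <cassert>
#include <cstddef>
#include <string>

// Hypothetical helper: computes the decoded byte count of a base64 string.
// Every 4 encoded characters carry 3 bytes; each trailing '=' drops one byte.
std::size_t decoded_length(const std::string &encoded) {
    std::size_t padding = 0;
    if (!encoded.empty() && encoded.back() == '=') ++padding;
    if (encoded.size() > 1 && encoded[encoded.size() - 2] == '=') ++padding;
    return (encoded.size() / 4) * 3 - padding;
}

int main() {
    assert(decoded_length("AQAAAA==") == 4); // four bytes: 01 00 00 00
}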

Assuming you are using this code for base64, I suspect the following will work:

int x = 1;
std::string data = base64_encode(reinterpret_cast<unsigned char *>(&x), 4);
std::string out = base64_decode(data);
int y = *(reinterpret_cast<const int*>(out.c_str())); // const int*: c_str() returns a const pointer

On whatever side is doing the decode logic, it should probably validate that out.size() == sizeof(int) before doing this conversion.
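As a sketch of that check (and to sidestep the alignment and strict-aliasing concerns of casting the string's buffer directly), the decode side could copy the bytes out with std::memcpy instead. This assumes the base64_decode signature from the linked implementation; decode_int is a hypothetical helper of my own:

#include <cstring>
#include <stdexcept>
#include <string>

// Declaration matching the linked base64 implementation.
std::string base64_decode(const std::string &encoded_string);

int decode_int(const std::string &encoded) {
    std::string out = base64_decode(encoded);
    if (out.size() != sizeof(int)) // validate before converting
        throw std::runtime_error("decoded payload is not sizeof(int) bytes");
    int y = 0;
    std::memcpy(&y, out.data(), sizeof(int)); // well-defined and alignment-safe
    return y;
}

A plain memcpy compiles to the same load on mainstream compilers, but unlike the pointer cast it is well-defined even when the string's buffer is not suitably aligned for int.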
