
Converting an array of chars to std::string in order to pass into std::bitset seg fault

Before you downvote this, please read carefully; it does get interesting. Basically, I want to convert a char array into an std::string in order to use std::bitset operations, but when I try to create the std::bitset object at runtime I get this error:

terminate called after throwing an instance of 'std::invalid_argument'
  what():  bitset::_M_copy_from_ptr
Aborted (core dumped)

Here's the code:

#include <iostream>
#include <cstdlib>
#include <bitset>

int main()
{
    char BYTE_4[4] = { 1, 0, 0, 0};

    std::string str_BYTE_4 = std::string(BYTE_4);

    std::bitset<32> str_BYTE_4_bit(str_BYTE_4); // crashes here
    std::cout << "str_BYTE_4_bit. " << str_BYTE_4_bit << std::endl;

    return 0;
}

I also tried some other kinds of conversion, with std::stringstream and with pointers to both char and std::string, but no matter what I pass into that std::bitset constructor I get the same error.

These are just snippets I commented out and removed from the above code, to show what I tried.

    //char* BYTE_4 = new char[4];
    //std::stringstream SS;

    //std::string str_BYTE_4 = "0101";
    //SS << BYTE_4;
    //str_BYTE_4 = SS.str();
    //for(int index = 0; index < 4; index++)
    //    str_BYTE_4 += BYTE_4[index];

    //std::string *str_BYTE_4 = new std::string[4];
    //for( int index = 0; index < 4; index++)
    //    BYTE_4[index] = rand()%255;

This is wrong:

char BYTE_4[4] = { 1, 0, 0, 0};
std::string str_BYTE_4 = std::string(BYTE_4);

What you need is a string of digit characters, but you are storing the raw byte values 1 and 0 (not the ASCII characters '1' and '0'). Fix it like this:

char BYTE_4[4] = { '1', '0', '0', '0'};
std::string str_BYTE_4 = std::string(BYTE_4, sizeof(BYTE_4));

Since there is no null terminator, you must tell the std::string constructor where to stop (by passing sizeof(BYTE_4), i.e. 4, as the second argument).

An even easier way would be:

std::string str_BYTE_4 = "1000";

As for the invalid_argument exception you got: if you read the documentation for std::bitset, you will see that it means you passed a string containing a character that was neither '0' nor '1' (those being ASCII characters, whose integer values are 48 and 49).

The std::string constructed from

char BYTE_4[4] = { 1, 0, 0, 0};

is no different from the std::string constructed from

char BYTE_4[4] = { 1, '\0', '\0', '\0'};

The constructor stops at the first '\0', so you only have the char represented by the integer value 1 in the std::string. That is the source of the problem.

In order to construct a std::bitset from a std::string, the std::string must contain only the characters '1' and '0'. Hence you need to use the characters '1' and '0', not the integer values 1 and 0.

You can use:

char BYTE_4[] = {'1', '0', '0', '0', '\0'};
std::string str_BYTE_4 = std::string(BYTE_4);

or

char BYTE_4[4] = {'1', '0', '0', '0'};
std::string str_BYTE_4 = std::string(BYTE_4, 4);

in order to be able to construct a std::bitset from the std::string .

For what it's worth:

std::bitset<32> str_BYTE_4_bit(std::string());

creates a bitset whose value consists of 32 zero bits.

std::bitset<32> str_BYTE_4_bit(std::string("1000"));

creates a bitset whose value consists of 28 leading zero bits followed by the 4 bits 1000.
