
converting an int to char*

This is a very very basic question and I know one way is to do the following:

char buffer[33];
itoa(aq_width, buffer,10);

where aq_width is the int. But then I can't guarantee what size of buffer I would need to do this... I could always allocate a very large buffer, but that wouldn't be very nice. Is there any other pretty and simple way to do this?

#include <sstream>
#include <string>
#include <cstring>
#include <cassert>

std::stringstream ss;
ss << 3;
std::string s = ss.str();
assert(!strcmp("3", s.c_str()));

You can calculate an upper bound on the required buffer size with this macro (for signed types; CHAR_BIT comes from <limits.h>):

#define MAX_SIZE(type) ((CHAR_BIT * sizeof(type) - 1) / 3 + 2)

By the way, itoa() isn't standard C and isn't available everywhere. snprintf() will do the job:

char buffer[MAX_SIZE(aq_width)];
snprintf(buffer, sizeof buffer, "%d", aq_width);

A nice function from glibc is asprintf() (a GNU extension, not standard C), used like:

char *buffer = NULL;
int buffer_size = asprintf(&buffer, "%d", aq_width);
assert(buffer_size >= 0);
...
free(buffer);

which will allocate the buffer for you.

More portably, you can use snprintf, which returns the number of characters needed (excluding the terminating NUL), so you can size the buffer exactly before writing into it.

There are more memory efficient ways of doing it, but they aren't "simple".

Consider the cost of 100 bytes of memory for, say, one second. A gigabyte for a lifetime of ten years costs, what, $10? We're talking nanocents here.

Make the buffer too big. Note that a 32-bit int can't be more than 10 digits in decimal, and a 64-bit int can't be more than 20 (plus one character for the sign and one for the terminating NUL).
