ARM cortex-M3 uint_fast32_t vs uint32_t

I am developing a program for an STM32Fx cortex-M3 series processor. In stdint.h the following are defined:

typedef unsigned int uint_fast32_t;
typedef uint32_t  uint_least32_t;
typedef unsigned long uint32_t;

As I understand it:

[u]int_fast[n]_t will give you the fastest data type of at least n bits.
[u]int_least[n]_t will give you the smallest data type of at least n bits.
[u]int[n]_t will give you the data type of exactly n bits.

Also, as far as I know, sizeof(unsigned int) <= sizeof(unsigned long) and UINT_MAX <= ULONG_MAX always hold.

Thus I would expect uint_fast32_t to be a data type with a size equal to or greater than the size of uint32_t.

In the case of the cortex-M3 sizeof(unsigned int) == sizeof(unsigned long) == 4. So the above definitions are 'correct' in terms of size.

But why are they not defined in a way that is consistent with the names and logical sizes of the underlying data types, i.e.

typedef unsigned long uint_fast32_t;
typedef unsigned int  uint_least32_t;
typedef uint_fast32_t uint32_t;

Can someone please clarify the selection of the underlying types?

Given that 'long' and 'int' are the same size, why not use the same data type for all three definitions?

typedef unsigned int uint_fast32_t;
typedef unsigned int uint_least32_t;
typedef unsigned int uint32_t;

The fact is, it is only guaranteed that

sizeof(long) >= sizeof(int)

and it is not guaranteed that long is actually any longer. On many systems, int is as big as long.

See my answer to your other question.

Basically, it doesn't matter which type is used. Given that int and long are the same size and have the same representation and other characteristics, the implementer can choose either type for int32_t, int_fast32_t, and int_least32_t, and likewise for the corresponding unsigned versions.

(It's possible that the particular choices could be influenced by a perceived need to use the same header for implementations with different sizes for int and long, but I don't see how the particular definitions you quoted would achieve that.)

As long as the types are the right size and meet all the other requirements imposed by the standard, and as long as you don't write code that depends on, for example, int32_t being compatible with int, or with long, it doesn't matter.

The particular choices made were likely an arbitrary whim of the implementer -- which is perfectly acceptable. Or perhaps that header file was modified by two or more developers who had different ideas about which type is best.
