
Difference between int and signed int declaration

I am reading some tutorials on embedded programming, and one of them says int and signed int are different, but it does not explain how or why.

I understand why unsigned int and int are different, but int and signed int being different is new to me.

It is for historical reasons only. Today, whenever you declare int you get a signed int. The only place where you might still see a difference with today's compilers is char versus signed char, which are distinct types by specification (and noticeable, for example, when assigning a string literal), but not with int.
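Not from the original answer, but a minimal C11 sketch of this point, using _Generic (which dispatches on the declared type of its controlling expression): int and signed int select the same branch, while char and signed char select different branches.

#include <stdio.h>

int main(void)
{
    int a = 0;
    signed int b = 0;   /* same type as int, per the standard */
    char c = 0;
    signed char d = 0;

    /* int and signed int name the same type, so both pick the int branch.
     * (Listing both int and signed int as branches would even be a
     * constraint violation, because they are duplicates.) */
    puts(_Generic(a, int: "a is int", default: "a is something else"));
    puts(_Generic(b, int: "b is int", default: "b is something else"));

    /* char, signed char, and unsigned char are three distinct types,
     * even on platforms where plain char happens to be signed. */
    puts(_Generic(c, char: "c is char",
                     signed char: "c is signed char",
                     unsigned char: "c is unsigned char"));
    puts(_Generic(d, char: "d is char",
                     signed char: "d is signed char",
                     unsigned char: "d is unsigned char"));
    return 0;
}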

As far as I know, the difference exists only for the char data type: char a; can behave like signed char a; or unsigned char a;, depending on compiler options (for example the --signed_chars option mentioned in the linked article). For the int data type, there is no difference between int and signed int.
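As a small illustration (not from the original answer), the signedness of plain char is implementation-defined and can be detected portably through CHAR_MIN; on GCC it can be switched with -fsigned-char / -funsigned-char, and other toolchains expose similar flags such as the --signed_chars option cited above.

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_MIN is 0 when plain char is unsigned, and negative (typically
     * -128) when it is signed; the choice is implementation-defined. */
    if (CHAR_MIN < 0)
        puts("plain char is signed on this target");
    else
        puts("plain char is unsigned on this target");

    printf("CHAR_MIN = %d, CHAR_MAX = %d\n", CHAR_MIN, CHAR_MAX);
    return 0;
}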
