
What is the difference between Int and Int32 in Swift?

In Core Data you can store Int16, Int32, and Int64, but these are different from Int. Why do these types exist, and how do you use them?

According to the Swift Documentation

Int

In most cases, you don't need to pick a specific size of integer to use in your code. Swift provides an additional integer type, Int, which has the same size as the current platform's native word size:

On a 32-bit platform, Int is the same size as Int32.

On a 64-bit platform, Int is the same size as Int64.

Unless you need to work with a specific size of integer, always use Int for integer values in your code. This aids code consistency and interoperability. Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.
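You can verify this on your own machine; a minimal sketch (the values in the comments assume a 64-bit platform):

    // MemoryLayout reports the size of a type in bytes.
    print(MemoryLayout<Int>.size)    // 8 on a 64-bit platform (same as Int64)
    print(MemoryLayout<Int32>.size)  // always 4
    print(MemoryLayout<Int64>.size)  // always 8
    print(Int.max == Int64.max)      // true on a 64-bit platform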

The number after "Int" refers to how many bits the type uses. For something small, such as a running count of how many objects in a list match a specific condition, you could use a UInt8, an integer with a maximum value of 255 (2^8 − 1) and a minimum value of 0, because it is unsigned (that is the difference between UInt8 and Int8). A signed type (one without the "U" prefix) can also hold negative values. The same is true for Int16, Int32, and Int64. The benefit of using a smaller integer type is usually negligible, so you don't need to use these if you don't want to.
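You can print each type's bounds directly with the standard library's min and max properties:

    print(UInt8.min, UInt8.max)   // 0 255
    print(Int8.min, Int8.max)     // -128 127
    print(Int16.min, Int16.max)   // -32768 32767
    print(Int32.min, Int32.max)   // -2147483648 2147483647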

There are no performance savings when you are running on a laptop or an iOS device with a 32-bit or 64-bit processor. Just use Int. The CPU doesn't use only 8 bits when you use an Int8; it uses its entire register width regardless of which type you choose, because the hardware is already in the chip.

Now, if you had an 8-bit CPU, using Int32 would require the compiler to do a bunch of backflips and magic tricks to make the 32-bit integer work.

Swift's Int is a platform-dependent type: it has the same size as the platform's native word (Int32 on a 32-bit platform and Int64 on a 64-bit platform).

As a programmer, you should not declare a fixed-size integer (e.g. Int32, Int64) unless you really need it. For example, when you are working on a 32-bit platform with numbers that cannot be represented in 4 bytes, you can declare Int64 instead of Int:

  • Int32 : 4 bytes: from −2147483648 to +2147483647
  • Int64 : 8 bytes: from −9223372036854775808 to +9223372036854775807
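In practice, you store the fixed-width value in Core Data and convert at the boundary. A minimal sketch, assuming a model entity with an attribute declared as Integer 32 (the Item type and countValue name are hypothetical):

    import CoreData

    // Hypothetical NSManagedObject subclass; "count" is modeled as Integer 32.
    final class Item: NSManagedObject {
        @NSManaged var count: Int32
    }

    extension Item {
        // Expose the attribute to the rest of the app as a plain Int.
        var countValue: Int {
            get { Int(count) }                        // widening is always safe here
            set { count = Int32(clamping: newValue) } // saturates instead of trapping on overflow
        }
    }

Int32(exactly:) is an alternative to Int32(clamping:) if you would rather get nil than clamp when a value doesn't fit.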
