
PowerShell Core deserializes numbers in JSON as Int64, whereas Windows PowerShell deserializes them as Int32

Please observe:

Windows PowerShell

C:\> ("1" | ConvertFrom-Json).gettype().name
Int32
C:\>

PowerShell Core

C:\> ("1" | ConvertFrom-Json).gettype().name
Int64
C:\>

This is not benign. Consider a hashtable whose keys are integers:

$x = @{
    123 = 1
}

The key 123 is an Int32, not an Int64. So, if 123 comes from parsed JSON, it will be of different types in the two shells. Now:

C:\> $x[[Int32]123]
1
C:\> $x[[Int64]123]
C:\>

And this is true in both shells. This change in behavior wreaks havoc in our automation scripts that manipulate things using REST APIs.
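
For example, here is a minimal sketch of the failure mode, assuming the key arrives straight from ConvertFrom-Json:

# Literal hashtable key 123 is an Int32 in both editions.
$x = @{ 123 = 1 }
$key = '123' | ConvertFrom-Json   # Int32 in Windows PowerShell, Int64 in PowerShell Core
$x[$key]                          # 1 in Windows PowerShell; $null (no match) in PowerShell Core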

Can this behavior of PowerShell Core be turned off?

  • The two PowerShell editions use different implementations, causing the divergent behavior you observed:

    • Windows PowerShell uses a custom implementation, whereas PowerShell [Core] v6+, as of v7.1, uses the Json.NET library behind the scenes; see this answer for that library's rationale for deserializing to System.Int64 ( [long] ) by default.
  • As of PowerShell 7.1, there's a planned move to the now-native .NET JSON functionality (available in .NET Core 3+), exposed via the System.Text.Json namespace, which may restore deserializing to System.Int32 ( [int] ) by default, given that breaking changes are inevitable anyway:

    • See the discussion in GitHub issue #14264 (created by iRon based on this question) and GitHub PR #11198, which is preparing the move to System.Text.Json.

    • A related problem is that numbers too large to fit into a System.Int64 ( [long] ) are also deserialized differently (see GitHub issue #9207; illustrated in the snippet after this list):

      • Windows PowerShell: first chooses System.Decimal ( [decimal] ) and, for even larger numbers, System.Double ( [double] ).
      • PowerShell [Core] as of v7.1: always chooses System.Numerics.BigInteger ( [bigint] ).
    • Also, there are differences with respect to the number formats supported:

      • Windows PowerShell does not recognize hexadecimal numbers (e.g., 0x10), in accordance with the JSON spec [1], whereas PowerShell [Core] as of v7.1 does; however, as another extension to the spec, both support scientific notation (e.g., 1e2 for 100) and parse it as [double].

      • Windows PowerShell, as another extension to the spec, does support +-prefixed numbers (e.g., +10), whereas PowerShell [Core] as of v7.1 does not.

      • (Also, both editions support single-quoted strings as an extension.)
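
These differences can be observed directly with ConvertFrom-Json. A quick sketch based on the behavior described above (the literal 9223372036854775808 is 2^63, i.e. one past [long]::MaxValue):

# Large numbers (beyond [long]):
('9223372036854775808' | ConvertFrom-Json).GetType().Name
# Windows PowerShell: Decimal; PowerShell [Core] as of v7.1: BigInteger

# Hexadecimal input (a PowerShell [Core] extension to the spec):
'0x10' | ConvertFrom-Json        # 16 in PowerShell [Core]; an error in Windows PowerShell

# Scientific notation (both editions, parsed as [double]):
('1e2' | ConvertFrom-Json).GetType().Name   # Double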


Workaround:

  • Generally, note that the problem may often not surface, given PowerShell's ability to mix different numeric types and widen types on demand.

  • However, as the question shows, when numbers are used as the keys of a hashtable (dictionary), a type-exact value must be passed for lookup in order to locate entries.

Therefore, the simplest workaround is to cast the hashtable key to [int], which allows later lookups with just, say, [123] (or even .123):

# Works in both Windows PowerShell and PowerShell [Core]
# Without the [int] cast, the lookup would fail in PowerShell [Core] as of v7.1
PS> $key = '123' | ConvertFrom-Json; @{ [int] $key = 'bingo' }[123]
bingo
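
A variant of the same idea is to leave the literal hashtable key untouched and instead cast the JSON-derived value to [int] at lookup time; a sketch:

# Cast at lookup time rather than at key-creation time; works in both editions.
PS> $key = '123' | ConvertFrom-Json; @{ 123 = 'bingo' }[[int] $key]
bingo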

Another option is to use a [pscustomobject] rather than a hashtable, in which case the numeric keys implicitly become property names, which are always strings.

# Note the use of .123, i.e. property access.
PS> $key = '123' | ConvertFrom-Json; ([pscustomobject] @{ [int] $key = 'bingo' }).123
bingo

This would work even with a numeric variable:

# Note: $key is still the JSON-parsed value from the previous example.
$lookup = 123; ([pscustomobject] @{ [int] $key = 'bingo' }).$lookup

Caveat: When a key is implicitly stringified to become a property name, it is invariably the decimal representation that is used; e.g., [pscustomobject] @{ [int] 0x10 = 'bingo' } results in an object whose property name is '16'. [2]
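
A quick sketch of that stringification, using PSObject.Properties to inspect the resulting property name:

PS> ([pscustomobject] @{ [int] 0x10 = 'bingo' }).PSObject.Properties.Name
16
PS> ([pscustomobject] @{ [int] 0x10 = 'bingo' }).16
bingo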

However, note that hashtables / dictionaries are more lightweight than [pscustomobject]s.


[1] However, the JSON5 format, which is meant to improve on JSON, does support hex numbers, along with other notable improvements, such as support for comments, extraneous trailing commas, and single-quoted strings.

[2] Also, with [double] values the conversion is culture-sensitive, so that 1.2 can in some cultures result in '1,2' (as of v7.1, which is unexpected; see GitHub issue #14278); also, large [double]s can end up in scientific notation, so that 1000000000000000.1 results in '1E+15'. That said, using [double]s as dictionary keys is generally ill-advised, given the accuracy limits of their conversion from decimal numbers.
