
[SR-5640] JSONEncoder misrepresents UInt.max on Linux #4417

Description

@swift-ci
Previous ID: SR-5640
Radar: None
Original Reporter: djones6 (JIRA User)
Type: Bug
Status: Resolved
Resolution: Done

Environment: Ubuntu 16.04, DEVELOPMENT-SNAPSHOT-2017-08-03-a

Additional Detail from JIRA
Votes: 0
Component/s: Foundation
Labels: Bug
Assignee: @spevans
Priority: Medium

md5: 6cedaf4523ec8dec2eec658df2816060

Issue Description:

JSONEncoder on Linux fails to correctly encode UInt values that are greater than Int.max. For example:

import Foundation

let encoder = JSONEncoder()

struct MyValue: Codable {
  let intMin: Int
  let intMax: Int
  let uintMin: UInt
  let uintMax: UInt

  init() {
    intMin = Int.min
    intMax = Int.max
    uintMin = UInt.min
    uintMax = UInt.max
  }
}

let myValue = MyValue()
let myDict: [String: Any] = ["intMin": myValue.intMin, "intMax": myValue.intMax,
                             "uintMin": myValue.uintMin, "uintMax": myValue.uintMax]

// Encode the same values via JSONEncoder and JSONSerialization for comparison.
let result = try encoder.encode(myValue)
print("Result (JSONEncoder): \(String(data: result, encoding: .utf8) ?? "nil")")

let result2 = try JSONSerialization.data(withJSONObject: myDict)
print("Result (JSONSerialization): \(String(data: result2, encoding: .utf8) ?? "nil")")

On Ubuntu, this produces a value of -1 for UInt.max:

Result (JSONEncoder): {"uintMin":0,"intMin":-9223372036854775808,"uintMax":-1,"intMax":9223372036854775807}
Result (JSONSerialization): {"uintMin":0,"intMin":-9223372036854775808,"uintMax":18446744073709551615,"intMax":9223372036854775807}

On macOS, both methods produce the same (correct) output.

I suspect this has something to do with the boxing of UInt values to NSNumber on Linux; it appears the bytes are being interpreted as signed rather than unsigned.
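
If that theory is right, the symptom matches a plain bit-pattern reinterpretation: UInt.max and Int(-1) share the same 64-bit representation. A minimal sketch to illustrate, assuming (not confirmed) that the encoder routes unsigned integers through NSNumber:

import Foundation

let u = UInt.max
// Reinterpreting the unsigned bit pattern as signed yields -1,
// matching the JSONEncoder output above.
print(Int(bitPattern: u))  // -1

// Probing the suspected boxing path directly:
let boxed = NSNumber(value: u)
print(boxed.stringValue)                // "18446744073709551615" if the sign is preserved
print(String(cString: boxed.objCType))  // "Q" (unsigned long long) for a correct unsigned box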
