[sparkplug-payload] returns UInt32 values as BigInt, but the JS number primitive can represent all UInt32 values
getValue and setValue should use the int_value protobuf field for UInt32, not long_value.
https://2ality.com/2012/02/js-integers.html
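As background, every UInt32 value (0 through 2^32 - 1) is well below `Number.MAX_SAFE_INTEGER` (2^53 - 1), so a plain JS number can hold any UInt32 without loss and BigInt is unnecessary here. A quick check:

```javascript
// The largest UInt32 value fits comfortably in a double-precision JS number.
const UINT32_MAX = 2 ** 32 - 1; // 4294967295

console.log(Number.isSafeInteger(UINT32_MAX));          // true
console.log(UINT32_MAX < Number.MAX_SAFE_INTEGER);      // true
```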
The Sparkplug spec says that `Metric.DataType = UInt32` should be transmitted as Google Protocol Buffer type `uint32`, so this bug contradicts the spec. It also means the library cannot parse correctly formed UInt32 metrics.

One relevant code location in the Java encoder: https://github.com/eclipse/tahu/blob/e86127f00f1c6641e910eac734d5c4790fb082a8/client_libraries/java/src/main/java/org/eclipse/tahu/message/SparkplugBPayloadEncoder.java#L327

One relevant code location in the Java decoder: https://github.com/eclipse/tahu/blob/e86127f00f1c6641e910eac734d5c4790fb082a8/client_libraries/java/src/main/java/org/eclipse/tahu/message/SparkplugBPayloadDecoder.java#L156

There are more affected areas.
Relevant section of the Sparkplug B spec v2.2, section 15.2.1:

> UInt32
> ▪ Unsigned 32-bit integer
> ▪ Google Protocol Buffer Type: uint32
> ▪ Sparkplug™ enum value: 7
There is no good workaround that lets an encoder support both the spec and the current buggy version of Java Tahu at the same time. It might be wise to add a setting somewhere to switch between following the standard and communicating with older implementations.
This is because int_value and long_value are members of a Google protobuf "oneof": a message can carry at most one of these values, never both, so an encoder cannot set both fields to satisfy old and new decoders simultaneously.
That's a good point, I forgot to check that section of the spec.