Treat UTF-16 strings in binary VDF as little-endian
Integers in binary VDF are already decoded as little-endian (least significant byte first) regardless of CPU architecture, but the 16-bit code units in UTF-16 strings did not get the same treatment. This caused a test failure on big-endian machines.
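To illustrate the fix, here is a minimal sketch (not the library's actual code) of reading a NUL-terminated UTF-16 string from a binary VDF buffer. The `read_utf16_string` helper is hypothetical; the point is that both the `struct` format (`<H`) and the codec (`utf-16-le`) pin the byte order explicitly, instead of relying on host byte order or a BOM:

```python
import struct

def read_utf16_string(buf: bytes, offset: int = 0) -> str:
    """Read UTF-16 code units up to the 16-bit NUL terminator."""
    units = []
    while True:
        # '<H' forces little-endian regardless of host CPU byte order
        (unit,) = struct.unpack_from('<H', buf, offset)
        offset += 2
        if unit == 0:
            break
        units.append(unit)
    # Re-pack as little-endian bytes and decode with an explicit codec,
    # rather than the bare 'utf-16' codec, whose behavior without a BOM
    # is not what the on-disk format guarantees
    data = b''.join(struct.pack('<H', u) for u in units)
    return data.decode('utf-16-le')

raw = b'S\x00t\x00e\x00a\x00m\x00\x00\x00'  # "Steam" + NUL, little-endian
print(read_utf16_string(raw))  # -> Steam
```

On a little-endian machine the old code happened to produce the same result, which is why the bug only surfaced on big-endian hardware like s390x.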
Resolves: https://github.com/ValvePython/vdf/issues/33
~~I still need to test this on a big-endian machine, I'll mark as non-draft after that.~~
The CI failures are expected, see #56 for fixes for those.
Verified that the tests pass on a Debian-developer-accessible s390x (big-endian) machine.