
Treat UTF-16 strings in binary VDF as little-endian#5

Merged
WinterPhoenix merged 1 commit into solsticegamestudios:master from smcv:utf16le
Feb 6, 2026

Conversation


@smcv smcv commented Feb 1, 2026

Integers in binary VDF are already treated as little-endian (least significant byte first) regardless of CPU architecture, but the 16-bit units in UTF-16 didn't get the same treatment. This led to a test failure on big-endian machines.

Resolves: ValvePython#33
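As context for the fix, the integer handling the description refers to can be sketched like this (a hypothetical illustration, not the library's actual code): using an explicit `<` (little-endian) format string with `struct.unpack` makes the result independent of the host CPU's byte order.

```python
# Hypothetical sketch: binary VDF stores a 32-bit integer least
# significant byte first. The "<" prefix forces little-endian
# interpretation, so this decodes identically on BE and LE hosts.
import struct

raw = b"\x2a\x00\x00\x00"  # 42, least significant byte first
value, = struct.unpack("<i", raw)
print(value)  # → 42
```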

Integers in binary VDF are already treated as little-endian (least
significant byte first) regardless of CPU architecture, but the 16-bit
units in UTF-16 didn't get the same treatment. This led to a test failure
on big-endian machines.

Resolves: ValvePython#33
Signed-off-by: Simon McVittie <smcv@debian.org>
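The change described above can be sketched as follows (a hypothetical illustration, not the patch itself; the helper name is invented): reading a NUL-terminated UTF-16 string from a binary VDF stream and decoding it with the explicit `utf-16-le` codec, which fixes the byte order, whereas the plain `utf-16` codec consults a BOM or falls back to native order and so misbehaves on big-endian machines.

```python
# Hypothetical sketch: read 16-bit units until a 16-bit NUL, then
# decode with an explicitly little-endian codec so the result is the
# same regardless of CPU architecture.
import io

def read_utf16_string(stream):
    """Read little-endian UTF-16 code units until a 16-bit NUL."""
    units = bytearray()
    while True:
        pair = stream.read(2)
        if len(pair) != 2 or pair == b"\x00\x00":
            break
        units.extend(pair)
    # "utf-16-le" pins the byte order; plain "utf-16" would not.
    return units.decode("utf-16-le")

data = io.BytesIO("Steam".encode("utf-16-le") + b"\x00\x00")
print(read_utf16_string(data))  # → Steam
```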
@smcv
Author

smcv commented Feb 1, 2026

Previously proposed at ValvePython#57.

@WinterPhoenix
Member

WinterPhoenix commented Feb 6, 2026

Sounds good to me. I don't know of any situation where something might use UTF-16BE, since UTF-16 is pretty much always Windows crap, and therefore LE.

Thanks!

@WinterPhoenix WinterPhoenix merged commit 3d391b1 into solsticegamestudios:master Feb 6, 2026


Development

Successfully merging this pull request may close these issues.

BinaryVDF.test_loads_utf16 not passed
