Error while creating data: "struct.error: int too large to convert"
Using NBT 1.5.1 on macOS 12.2.1, I am having trouble creating NBT data. The main code is:
chunk_data = BytesIO()
nbt_data = self.get_nbt()
nbt_data.write_file(buffer=chunk_data) # ERROR LINE
where I get the following error:
File "/Users/a/Private/Minecraft/PyBlock/PyBlock/pyblock/chunk.py", line 248, in get_bytes
nbt_data.write_file(buffer=chunk_data)
File "/Users/a/Private/Minecraft/PyBlock/venv/lib/python3.8/site-packages/nbt/nbt.py", line 712, in write_file
self._render_buffer(self.file)
File "/Users/a/Private/Minecraft/PyBlock/venv/lib/python3.8/site-packages/nbt/nbt.py", line 515, in _render_buffer
tag._render_buffer(buffer)
File "/Users/a/Private/Minecraft/PyBlock/venv/lib/python3.8/site-packages/nbt/nbt.py", line 427, in _render_buffer
tag._render_buffer(buffer)
File "/Users/a/Private/Minecraft/PyBlock/venv/lib/python3.8/site-packages/nbt/nbt.py", line 515, in _render_buffer
tag._render_buffer(buffer)
File "/Users/a/Private/Minecraft/PyBlock/venv/lib/python3.8/site-packages/nbt/nbt.py", line 515, in _render_buffer
tag._render_buffer(buffer)
File "/Users/a/Private/Minecraft/PyBlock/venv/lib/python3.8/site-packages/nbt/nbt.py", line 317, in _render_buffer
buffer.write(self.fmt.pack(*self.value))
struct.error: int too large to convert
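For context, the traceback bottoms out in the stdlib struct module (`self.fmt.pack`), not in NBT's own code. A minimal sketch of the failure, assuming the tag packs signed 64-bit big-endian longs (struct format `>q`, which is what a 64-bit NBT long uses):

```python
import struct

# A signed 64-bit big-endian integer, as used for 64-bit NBT values.
fmt = struct.Struct(">q")

fmt.pack(2**63 - 1)  # largest signed 64-bit value: packs fine

try:
    fmt.pack(2**63)  # one past the signed range
except struct.error as e:
    print(e)  # int too large to convert
```

This also explains why the error looks random: it only fires for values whose top bit is set, i.e. values at or above 2**63.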
The interesting thing is: sometimes there is no problem! I run the EXACT same code and do not get any error. In most cases I do get the error above, but in some cases everything works fine.
Maybe this is a known issue? I will try to "reproduce" this error, though I am not sure that is possible at all.
I tried to create a reproducible example by writing the NBT data to a file. But either I get the exact same error when doing
nbt_data.write_file("test.nbt")
or it works and I can use the correct content of test.nbt, which does not help with debugging anything...
When I print the value of self.value on the line that raises the error, the values are always different. It appears random!
I just figured out why I was getting this problem, and here it goes: although the chunk data looks like plain 64-bit integers, NBT actually stores it as a SIGNED long. This means the leftmost bit represents the sign when dealing with it in Python, so any value at or above 2**63 is "too large" for struct to pack. I wrote an awful piece of code to convert the "too big" unsigned data element into a signed one that can be packed:
data = int.from_bytes((data).to_bytes(8, 'big', signed=False), 'big', signed=True)
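As a self-contained sketch of that fix: the round-trip through bytes reinterprets the unsigned value as its two's-complement signed equivalent, which the `>q` format then packs without error. The helper names here (`to_signed64`, `to_signed64_arith`) are my own for illustration, not part of the NBT library; the arithmetic variant is an equivalent alternative:

```python
import struct

def to_signed64(data: int) -> int:
    """Reinterpret an unsigned 64-bit value as signed (two's complement)."""
    return int.from_bytes(data.to_bytes(8, "big", signed=False), "big", signed=True)

def to_signed64_arith(data: int) -> int:
    """Same result with plain arithmetic: subtract 2**64 when the sign bit is set."""
    return data - 2**64 if data >= 2**63 else data

value = 0xFFFFFFFFFFFFFFFF            # 2**64 - 1: too large for a signed long
signed = to_signed64(value)           # -1
assert signed == to_signed64_arith(value)
struct.pack(">q", signed)             # now packs without struct.error
```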
I guess it's been a few months, but if you see this: hello!