Slicing decimal numbers ending in ".000..."
Summary: When one writes a float ending in ".0" to a JSON file, Bridge copies the file to the game folder with these decimals removed. The problem is that when an entity float property's range or default value is written with a ".0", the game reports the error "Error loading property: 'default'/'range' value/array does not match the specified type 'float'", since it requires the decimals that Bridge removes.
To Reproduce
- Create/modify an entity in Bridge.
- Add an entity property of type float.
- Enter a float number ending in ".0" in "default" or within the range. For example:
"example:remaining_fuel": { "type": "float", "range": [0.0, 30.0], "default": 0.0 }
- Save the file and start a world in Minecraft with the test entity.
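The stripped decimals follow from how standard JSON serializers handle numbers: JavaScript has no distinct float literal, so 0.0 === 0 and the serializer emits the shorter form. A minimal sketch of the effect using plain JSON.stringify (not Bridge's actual save path) on the property from the steps above:

```typescript
// JavaScript numbers have no separate float type, so 0.0 is identical to 0
// and JSON.stringify emits the shortest representation.
const property = {
  "example:remaining_fuel": { type: "float", range: [0.0, 30.0], default: 0.0 },
};

const saved = JSON.stringify(property);
// The trailing ".0" is gone: range becomes [0,30] and default becomes 0.
console.log(saved);
// → {"example:remaining_fuel":{"type":"float","range":[0,30],"default":0}}
```

Any save path that parses the file and re-serializes it with a standard JSON writer will lose the ".0" the same way.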
Observed behavior: Minecraft throws the error "Error loading property: 'default'/'range' value/array does not match the specified type 'float'", and when one checks the entity file in the game folder, the property's decimal numbers no longer have the ".0" (the cause of the in-game error).
Expected behavior: Bridge saves the entity file inside the game folder without removing the ".0" from the property, and Minecraft starts the world without a properties error.
Screenshots / File Attachments Example entity file: entity_example.json
Property in Bridge:
Property in game folder:
Game error:
Platform:
- OS: Windows 10
- App Version: 2.7.28
This is an important issue for us to address. I estimate that it will take significant development to fix properly; however, I have found a workaround: store the default value as a string, and append .00000001 to the range values, which Minecraft accepts with minimal impact.
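Concretely, the workaround rewrites the property from the repro example along these lines (the exact epsilon suffix is just what was found to work, not a documented requirement):

```json
"example:remaining_fuel": {
  "type": "float",
  "range": [0.00000001, 30.00000001],
  "default": "0.0"
}
```

The string default survives serialization untouched, and the non-integral range values cannot have their decimals stripped.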
Notes: This may require either using the json c parser or writing a custom one. I'll need to look into updating the tree editor as well as dash to use it. We'll also need to make sure this doesn't impact performance.
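One way a custom serializer could preserve the literal text of floats is to tag them, serialize the tags as placeholder strings, and unquote them in a post-processing pass. A sketch under that assumption (`FloatLiteral` and `stringifyPreservingFloats` are hypothetical names, not Bridge's actual API):

```typescript
// Wrapper holding the exact source text of a float, e.g. "0.0".
class FloatLiteral {
  text: string;
  constructor(text: string) {
    this.text = text;
  }
}

function stringifyPreservingFloats(value: unknown): string {
  const placeholders: string[] = [];
  // Serialize FloatLiteral values as indexed placeholder strings.
  const json = JSON.stringify(value, (_key, v) => {
    if (v instanceof FloatLiteral) {
      placeholders.push(v.text);
      return `__FLOAT_${placeholders.length - 1}__`;
    }
    return v;
  });
  // Replace each quoted placeholder with the raw numeric literal.
  return json.replace(/"__FLOAT_(\d+)__"/g, (_m, i) => placeholders[Number(i)]);
}

const out = stringifyPreservingFloats({
  type: "float",
  range: [new FloatLiteral("0.0"), new FloatLiteral("30.0")],
  default: new FloatLiteral("0.0"),
});
console.log(out);
// → {"type":"float","range":[0.0,30.0],"default":0.0}
```

The parser side would need the matching change: keeping the original token text for each number instead of collapsing it to a JavaScript number, which is where the tree editor and dash updates come in.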
Bumping this one up since it's still present. Thanks for the issue and the temporary workaround.
Notes: For the sake of development time, a new dash extension that simply corrects the issue could be used; the longer-term solution is custom JSON parsing, which can be looked into later.
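Such an extension could operate on the generated JSON text after serialization, re-adding a ".0" to bare integers inside float property definitions. A rough sketch (standalone function, not dash's real plugin API, and deliberately naive: it assumes no digits inside strings within the matched block):

```typescript
// Find property definition objects declaring "type": "float" and append
// ".0" to any bare integer literal inside them (range entries, default).
function ensureFloatDecimals(json: string): string {
  return json.replace(
    /\{[^{}]*"type"\s*:\s*"float"[^{}]*\}/g,
    // Within the matched block, an integer preceded by ":", "[" or ","
    // and followed by ",", "]" or "}" gets an explicit ".0" appended.
    (block) => block.replace(/(:\s*|\[\s*|,\s*)(-?\d+)(?=\s*[,\]}])/g, "$1$2.0"),
  );
}

const broken =
  '{"example:remaining_fuel":{"type":"float","range":[0,30],"default":0}}';
console.log(ensureFloatDecimals(broken));
// → {"example:remaining_fuel":{"type":"float","range":[0.0,30.0],"default":0.0}}
```

Numbers that already carry a decimal point are left alone, so the transform is idempotent; a real fix would still want proper tokenizing rather than regexes.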