Antoine GUEDON
Hello @cubantonystark, Thank you for your reply, and your images! When you say that you hardcoded the values for `high_poly`, do you mean that you changed them in the Python script?...
Hello @hanjoonwon, I suppose the lower part of the battery is badly reconstructed mainly because of the transparent pedestal. A transparent object like this one is a nightmare for mesh...
Hello @yuedajiong , I'm happy to see SuGaR applied to more datasets! Is this mesh the output of the coarse density script, or the refinement script? I suppose this is...
You're right, density mode actually works better for this kind of scene. But after the coarse extraction, the final refinement phase (`train_refined.py`) helps to smooth the mesh and get a good-looking...
Thank you for your feedback! Indeed, I'm quite surprised by your results, as I experimented with many custom datasets on my side, and the output mesh is generally good, even...
Hello @SeanGuo063, Thank you so much for your nice words! You're right, in practice, $\hat{f}$ is just an _estimator_ of the real SDF, so there may be cases where $\hat{f}$...
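For reference, here is roughly how $\hat{f}$ relates to the Gaussian mixture density (written from memory, so the notation may differ slightly from the paper; please check the paper for the exact definitions). The density at a point $p$ is

$$d(p) = \sum_{g} \alpha_g \exp\!\left(-\tfrac{1}{2}\,(p-\mu_g)^\top \Sigma_g^{-1}\,(p-\mu_g)\right),$$

and the SDF estimator is built from the log-density,

$$\hat{f}(p) = \pm\, s_{g^*}\sqrt{-2\,\log d(p)},$$

where $g^*$ is the Gaussian closest to $p$ and $s_{g^*}$ one of its scaling factors. Since $d(p)$ is only an approximation of an ideal surface-aligned density, $\hat{f}$ inherits that approximation error.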
Hey @pknmax, Sure, it's possible to do that; we provide several `train_*.py` and `extract_*.py` scripts for that purpose. Basically, the `train.py` script runs the full SuGaR pipeline, which is equivalent to running...
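To illustrate, a typical sequence looks roughly like this. This is only a sketch: the flag names and script arguments are from memory and may not be exact, so please check the repository's README for the precise command-line interface.

```shell
# Full pipeline in one command (coarse optimization + refinement + mesh extraction).
# <dataset> is a COLMAP dataset, <checkpoint> a vanilla 3DGS checkpoint,
# and -r selects the regularization mode ("density" or "sdf").
python train.py -s <dataset> -c <checkpoint> -r density

# Or, roughly equivalently, the individual stages:
python train_coarse_density.py -s <dataset> -c <checkpoint>   # coarse stage
python extract_mesh.py ...                                    # coarse mesh extraction
python train_refined.py ...                                   # joint refinement
```

Running the stages separately is useful if you want to inspect or tweak the coarse mesh before refinement.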
Hey @kusstox, Thanks for your nice words! The good news is, I know why you have these artifacts. This is due to the fact that (a) your images do not cover...
Hello @MagicJoeXZ, The number of iterations might affect the results, but I don’t think that’s the main issue here. Can I ask, what do your capture/training images look like? Did...
You're welcome @MagicJoeXZ, I’m always happy to help! Okay, since you took a 360° shot around the chair, it's expected to get a nice-looking chair but a messy background. If...