How to visualize the training step
Hello, I want to visualize the predicted image during training to show the efficiency of NGP. Are there any relevant settings in the code?
You can uncomment here:
https://github.com/KAIR-BAIR/nerfacc/blob/107d7466ef3fcfadd64e8b57ad3c1aafd4874fe0/examples/train_ngp_nerf.py#L298
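If you want to dump the prediction yourself instead, a minimal sketch is below. The variable names (`rgb`, `step`) are hypothetical stand-ins for whatever your training loop produces; the helper just converts a float image in [0, 1] to 8-bit so it can be written out with any image library:

```python
import numpy as np

def to_uint8(img):
    """Clip a float image in [0, 1] and convert to uint8 for saving."""
    return (np.clip(img, 0.0, 1.0) * 255).astype(np.uint8)

# Inside the training/eval loop (hypothetical names), e.g.:
#   imageio.imwrite(f"pred_{step:06d}.png", to_uint8(rgb.cpu().numpy()))
```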
Thanks for your reply. Is it possible to use nerfacc to speed up human NeRF training (such as animatable_nerf)? I tried to embed nerfacc in it, but found that some of the settings were hard to align.
You can refer to the d-nerf example; human NeRFs are very similar. You just need a more customized warping field conditioned on human pose instead of timestamps.
Checkout the explanation of the d-nerf usage here: https://www.nerfacc.com/en/latest/examples/dnerf.html
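To make the idea concrete, here is a minimal NumPy sketch of such a warping field, not part of nerfacc and with hypothetical dimensions (e.g. a 72-dim SMPL-style pose vector). It maps observed-space points plus a pose code to canonical-space points, with the output layer zero-initialized so the warp starts as the identity:

```python
import numpy as np

rng = np.random.default_rng(0)

class PoseWarpField:
    """Tiny MLP: (x, pose) -> canonical x. Hypothetical sketch, not the nerfacc API."""

    def __init__(self, x_dim=3, pose_dim=72, hidden=64):
        self.w1 = rng.normal(0.0, 0.1, (x_dim + pose_dim, hidden))
        # Zero-init the output layer so the initial warp is the identity.
        self.w2 = np.zeros((hidden, x_dim))

    def __call__(self, x, pose):
        # Broadcast the per-frame pose vector to every sample point.
        p = np.broadcast_to(pose, (x.shape[0], pose.shape[-1]))
        h = np.maximum(np.concatenate([x, p], axis=-1) @ self.w1, 0.0)  # ReLU
        return x + h @ self.w2  # residual warp into canonical space
```

The canonical-space output is what you would feed into the radiance field, exactly where the d-nerf example feeds its time-conditioned warp.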
I tried but got empty ray_indices during eval. It's weird. Do you know what might have caused it? Or do you have plans to provide an example on the ZJU dataset?
Can't tell where it went wrong. An empty ray_indices means either an all-zero occupancy grid or an improper bounding box. You can debug by tracing a single sample where you are certain the density should not be zero, and see at which step it was thrown away.
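A quick sanity check for those two failure modes might look like the sketch below. The argument names are hypothetical; pass in your occupancy grid's binary field, the scene AABB, and a batch of sample points you believe should be occupied:

```python
import numpy as np

def diagnose_empty_samples(binary_grid, aabb, pts):
    """Return (fraction of occupied cells, fraction of pts inside the AABB).

    binary_grid: boolean/0-1 occupancy volume.
    aabb: [x0, y0, z0, x1, y1, z1] scene bounding box.
    pts: (N, 3) sample points that should not have zero density.
    """
    occ_frac = float(np.asarray(binary_grid, dtype=float).mean())  # ~0 -> grid never updated
    lo, hi = np.asarray(aabb[:3]), np.asarray(aabb[3:])
    inside = float(np.all((pts >= lo) & (pts <= hi), axis=-1).mean())  # ~0 -> bad AABB
    return occ_frac, inside
```

If `occ_frac` is near zero the occupancy grid was never marked occupied (e.g. the density fn returned zeros during updates); if `inside` is near zero your points fall outside the AABB and are culled before sampling.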
Having an example for human NeRFs would be nice, as I also work in this field (check out my latest paper here: https://github.com/facebookresearch/tava), but there are many other TODOs that have higher priority than this.
If you make it work, please let me know and I'm happy to accept a PR / link to your repo.