nerfplusplus
Why does sampling t0 linearly from 0 to 1 give linear sampling in disparity?
I can follow the step in the paper where the ray origin is moved onto the z = -n plane:
```python
t = -(near + rays_o[..., 2]) / rays_d[..., 2]
rays_o = rays_o + t[..., None] * rays_d
```
But I don't understand why t0 varies linearly from 0 to 1 in NDC space. According to the formula, the NDC z-coordinate should range from -1 to 1.
You might want to check this thread: https://github.com/bmild/nerf/issues/18.
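One way to see the resolution numerically: under the NeRF NDC convention z_ndc = 1 + 2n/z (with the camera looking down -z), shifting the ray origin to the near plane gives z_ndc = -1 + 2t along the ray. So the ray parameter t spans [0, 1] while the NDC z-coordinate spans [-1, 1]; they are different quantities related by an affine map. The sketch below (with a hypothetical near-plane distance) also checks that linear samples in t are linear samples in disparity. This is an illustration of the convention, not code from either repository.

```python
import numpy as np

near = 1.0  # hypothetical near-plane distance for illustration

# Sample the ray parameter t linearly in [0, 1); t = 1 maps to infinity.
t = np.linspace(0.0, 0.9, 10)

# With the ray origin shifted onto the z = -near plane, the NDC
# z-coordinate of the point at parameter t is -1 + 2t: it spans [-1, 1)
# even though t itself spans [0, 1).
z_ndc = -1.0 + 2.0 * t

# The corresponding world-space depth along the ray:
# z_world = -near / (1 - t), so t = 0 is the near plane, t -> 1 is infinity.
z_world = -near / (1.0 - t)

# Consistency check: plugging z_world into z_ndc = 1 + 2*near/z_world
# recovers -1 + 2t.
assert np.allclose(1.0 + 2.0 * near / z_world, z_ndc)

# Disparity (1 / depth magnitude) equals (1 - t) / near, an affine
# function of t, so uniform steps in t give uniform steps in disparity.
disparity = 1.0 / np.abs(z_world)
steps = np.diff(disparity)
print(np.allclose(steps, steps[0]))  # True: uniform spacing in disparity
```

In short, t0 is the ray parameter, not the NDC depth itself; sampling it uniformly in [0, 1] is exactly uniform sampling in disparity, and the NDC z-range of [-1, 1] is recovered through z_ndc = -1 + 2t.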