Beniko_J
Hi, may I ask how you achieved 20Hz stereo images? I am also trying to run [VINS-Mono](https://github.com/HKUST-Aerial-Robotics/VINS-Mono) on AirSim, and when I set the image resolution to 288x512 (original...
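In case it helps to see what I am measuring against, here is a minimal sketch of a capture loop that pulls one stereo pair per RPC and reports the rate. The camera names ("front_left"/"front_right") and the `MultirotorClient` are assumptions from my own settings.json, so adjust them to your setup.

```python
import time
import airsim

# Assumed setup: a multirotor vehicle with stereo cameras named
# "front_left" / "front_right" defined in settings.json.
client = airsim.MultirotorClient()
client.confirmConnection()

requests = [
    airsim.ImageRequest("front_left",  airsim.ImageType.Scene, False, False),
    airsim.ImageRequest("front_right", airsim.ImageType.Scene, False, False),
]

n_frames, t0 = 0, time.time()
while n_frames < 200:
    responses = client.simGetImages(requests)  # both cameras in a single RPC
    n_frames += 1
print("stereo rate: %.1f Hz" % (n_frames / (time.time() - t0)))
```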
Also looking forward to the training configuration for the DTU dataset.
Can this bug be fixed by passing `invert=True` in line [100](https://github.com/nianticlabs/monodepth2/blob/ab2a1bf7d45ae53b1ae5a858f4451199f9c213b3/evaluate_pose.py#L100)?
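For anyone checking the same thing, here is a small standalone sanity check of what `invert=True` does in `transformation_from_parameters` (a sketch, run from the repo root so `layers.py` is importable; the [1, 1, 3] shapes are what I believe the pose decoder produces):

```python
import torch
from layers import transformation_from_parameters  # monodepth2's layers.py

# a small test pose: rotation about y plus a forward translation
axisangle = torch.zeros(1, 1, 3)
axisangle[..., 1] = 0.1
translation = torch.zeros(1, 1, 3)
translation[..., 2] = 0.5

T = transformation_from_parameters(axisangle, translation)
T_inv = transformation_from_parameters(axisangle, translation, invert=True)

# if invert=True really returns the inverse transform, this should be ~identity,
# which is what passing it at line 100 would rely on
print(torch.matmul(T, T_inv))
```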
Thanks for the reply, @mrharicot! I think your idea and the idea expressed in [Visual Chirality](https://linzhiqiu.github.io/papers/chirality/) do give some insight into this problem. > I don't suppose you also tried...
Hi, thank you for telling me about this. I just found that there is a file named 'evaluation.py' in the project, so I deleted my question. Never mind!
Hi, thank you for the reply. Yes, my question is about how to get an estimated T_w_c. Your world frame is not arbitrary but is built on the ground plane, so...
Hi, thank you for the reply! I think it is still unclear why using depth supervision may give us suboptimal results, especially in the case of using more than 20...
Hi, I also found that sigma_loss cannot be used in the current version of the code. It seems that `depth` needs to be passed into the `render(...)` function. Have you succeeded...
So can I think of the matching results of LoFTR as sparse optical flow, and could something like a **forward-backward consistency check** be applied to obtain better results?
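To make the question concrete, this is the kind of check I have in mind (a sketch; I assume LoFTR is run a second time with the image order swapped, and that the matched keypoints come out as plain Nx2 pixel arrays like the `mkpts0_f` / `mkpts1_f` tensors in the demo):

```python
import numpy as np
from scipy.spatial import cKDTree

def forward_backward_filter(kpts0_f, kpts1_f, kpts1_b, kpts0_b, px_thresh=2.0):
    """Forward pass matches kpts0_f[i] (image0) to kpts1_f[i] (image1);
    backward pass (images swapped) matches kpts1_b[j] to kpts0_b[j].
    Keep forward match i if some backward match starts within px_thresh of
    kpts1_f[i] and its endpoint lands within px_thresh of kpts0_f[i]."""
    tree = cKDTree(kpts1_b)                    # index backward start points in image1
    dist, j = tree.query(kpts1_f, k=1)         # nearest backward match for each forward match
    round_trip = np.linalg.norm(kpts0_b[j] - kpts0_f, axis=1)
    return (dist < px_thresh) & (round_trip < px_thresh)

# usage with the Nx2 / Mx2 pixel coordinates from the two LoFTR runs:
# mask = forward_backward_filter(mkpts0_f, mkpts1_f, mkpts1_b, mkpts0_b)
# mkpts0_f, mkpts1_f = mkpts0_f[mask], mkpts1_f[mask]
```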
Same issue here.