Results: 4 comments by Kai Xiong

You may want to check your `data_folder` setting in `data_generator`.

I have a similar question about the gradient. Actually, after `lnet.load_state_dict(gnet.state_dict())` is executed, ~~all the parameters in both `lnet` and `gnet` are shared~~. That is to say, the `opt.zero_grad()` will...
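For reference, here is a minimal sketch of the behavior in question, assuming standard PyTorch; the small `nn.Linear` modules are only illustrative stand-ins for the actual `lnet`/`gnet` models. It checks whether `load_state_dict()` shares parameter storage and whether `opt.zero_grad()` on one network touches the other:

```python
import torch
import torch.nn as nn

# Small stand-in models; the real lnet/gnet would be the actual networks in the repo.
gnet = nn.Linear(4, 2)
lnet = nn.Linear(4, 2)

# load_state_dict copies values into lnet's own tensors -- it does not tie storage.
lnet.load_state_dict(gnet.state_dict())
print(lnet.weight.data_ptr() == gnet.weight.data_ptr())  # False: separate storage

# Gradients are independent too: zeroing lnet's optimizer leaves gnet's grads intact.
opt = torch.optim.Adam(lnet.parameters())
gnet(torch.randn(1, 4)).sum().backward()
lnet(torch.randn(1, 4)).sum().backward()
opt.zero_grad()
print(gnet.weight.grad.abs().sum())  # non-zero: gnet's gradient is untouched
print(lnet.weight.grad)              # zeros or None, depending on the PyTorch version
```

In other words, after the copy the two networks evolve independently, so zeroing one optimizer's gradients has no effect on the other network's gradients.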

Thanks for your reply! @MorvanZhou 1. Yes, `load_state_dict()` will not make the parameters shared. I had noticed this and already struck out that sentence. 2. It will be very kind...

@nmerrill67 Hi, thanks for your reply. I have another question and would appreciate your help. I noticed that the evaluation results are based on a final global BA optimization [here](https://github.com/rpng/suo_slam/blob/5de01433d177fde5cac4423f05fd554e3c00794e/evaluate.py#L256). Then,...