
Question about the MegaDepth dataset for training.

Open LuoXubo opened this issue 1 year ago • 5 comments

Hi. I'm very interested in your work, XFeat. However, I ran into some problems when I tried to train the model. I built the MegaDepth training dataset according to the instructions in your repository, with the following structure:

    ├── megadepth_root_path
    │   ├── train_data
    │   │   ├── megadepth_indeices
    │   │   │   ├── scene_info_0.1_0.7
    │   │   │   │   ├── 0000_0.1_0.3.npz
    │   │   │   │   ├── ...
    │   │   │   ├── ...
    │   ├── MegaDepth_v1
    │   │   ├── 000
    │   │   │   ├── dense0
    │   │   │   │   ├── depths
    │   │   │   │   ├── images
    │   │   │   ├── dense1
    │   │   ├── ...

The dataset-loading part of the training code is as follows:

    import glob

    import torch
    import tqdm

    TRAIN_BASE_PATH = f"{config['megadepth_root_path']}/train_data/megadepth_indices"
    TRAINVAL_DATA_SOURCE = f"{config['megadepth_root_path']}/MegaDepth_v1"

    TRAIN_NPZ_ROOT = f"{TRAIN_BASE_PATH}/scene_info_0.1_0.7"

    npz_paths = glob.glob(TRAIN_NPZ_ROOT + '/*.npz')

    data = torch.utils.data.ConcatDataset(
        [MegaDepthDataset(root_dir=TRAINVAL_DATA_SOURCE, npz_path=path)
         for path in tqdm.tqdm(npz_paths, desc="[MegaDepth] Loading metadata")])

When I ran the training code, I got the following error:

Traceback (most recent call last):
  File "train.py", line 279, in <module>
    trainer = Trainer(config_path = '../config.yaml')
  File "train.py", line 39, in __init__
    self.load_config(config_path)
  File "train.py", line 82, in load_config
    data = torch.utils.data.ConcatDataset( [MegaDepthDataset(root_dir = TRAINVAL_DATA_SOURCE,
  File "/home/anaconda3/envs/tavins/lib/python3.8/site-packages/torch/utils/data/dataset.py", line 398, in __init__
    assert len(self.datasets) > 0, 'datasets should not be an empty iterable'  # type: ignore[arg-type]
AssertionError: datasets should not be an empty iterable
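(For context: the assertion fires because `glob.glob` silently returns an empty list when the index directory is missing or misspelled, so `ConcatDataset` receives no datasets. A minimal fail-fast check, with an assumed helper name that is not part of the repo, could look like:)

```python
import glob


def find_npz_indices(npz_root: str) -> list:
    """Return the .npz index files under npz_root, failing fast if none exist.

    glob.glob never raises for a missing directory, so an explicit check
    gives a clearer error than ConcatDataset's assertion later on.
    """
    paths = sorted(glob.glob(npz_root + '/*.npz'))
    if not paths:
        raise FileNotFoundError(
            f"No .npz index files found under {npz_root}; "
            "check the megadepth_indices spelling and the root path."
        )
    return paths
```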

I suspect there is a problem with my dataset structure. Could you help me with this, or give an example of the expected structure of the dataset? Thank you very much.

Looking forward to your reply!

LuoXubo avatar Oct 25 '24 12:10 LuoXubo

Hi @LuoXubo,

Thank you for your interest in our work.

The dataset structure seems correct; however, I noticed a typo, "megadepth_indeices," which could be causing an error. Were you able to resolve the issue?
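For anyone hitting this, one way to catch a misspelled directory up front is to verify the expected layout before building the dataset. A rough sketch (the helper name and the hard-coded layout list are assumptions based on the tree earlier in this thread, not code from the XFeat repo):

```python
import os


def check_megadepth_layout(root: str) -> list:
    """Return the expected MegaDepth sub-directories that are missing under root."""
    expected = [
        'train_data/megadepth_indices/scene_info_0.1_0.7',
        'MegaDepth_v1',
    ]
    return [p for p in expected if not os.path.isdir(os.path.join(root, p))]
```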

Kind regards.

guipotje avatar Oct 27 '24 14:10 guipotje

Hi, thanks for your reply. I've fixed the problem :-)

LuoXubo avatar Oct 28 '24 07:10 LuoXubo


Hello, I have the same problem. How did you solve it?

easonzzmm avatar Dec 30 '24 08:12 easonzzmm

@guipotje Hi, do you have the evaluation code for the HPatches dataset and Aachen Day-Night? If so, could you send it to [email protected]? Thanks.

2805651606 avatar Mar 17 '25 12:03 2805651606