zhenmafan7
I'm sorry to bother you. In dataset.py there is the line "labels = np.load('/home/liekkas/DISK2/jian/PASCAL/VOC2012/cls_labels.npy')[()]", but I can't find cls_labels.npy. What should I do to solve this problem? (We're both Chinese, yet I still have to ask for help in my broken English; it's exhausting, crying.)
> These are the classification labels, which the author did not provide. You can remove the classification branch and comment out that line so the classification labels are not output.

I know this is a bit silly, but I still want to ask: which part is the classification branch? TAT
> The classifier module (not mask_classifier) is the classification branch; just delete the related parts and you'll be fine.

Thank you, I understand it in theory now; actually doing it... I'll work through it slowly, slowly, slowly (it's really hard TAT). Thank you, thank you, thank you.
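In case it helps others, here is a rough sketch of what the change described above could look like, assuming dataset.py wraps a PyTorch-style Dataset that used to return (image, mask, class_label); the class, attribute, and path handling below are illustrative guesses, not the repo's actual code:

```python
import numpy as np
from PIL import Image
from torch.utils.data import Dataset

class VOCSegmentation(Dataset):
    def __init__(self, image_paths, mask_paths):
        self.image_paths = image_paths
        self.mask_paths = mask_paths
        # The line to comment out, since cls_labels.npy is not provided:
        # self.cls_labels = np.load('/home/liekkas/DISK2/jian/PASCAL/VOC2012/cls_labels.npy')[()]

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, idx):
        image = np.array(Image.open(self.image_paths[idx]).convert('RGB'))
        mask = np.array(Image.open(self.mask_paths[idx]))
        # With the classifier branch removed, return only image and mask
        # instead of (image, mask, cls_label).
        return image, mask
```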
Sorry to bother you again. I removed the branch module as you described and trained; then, when running eval.py, I added a print statement to show the test results (without it I could not see any output at all, so I'm not sure whether I changed the code incorrectly). The printed results are as follows:

Length of test set: 1449
Each_cls_IOU: {'background': 0.0, 'aeroplane': 0.0, 'bicycle': 0.0, 'bird': 0.0, 'boat': 0.0, 'bottle': 0.0, 'bus': 0.0, 'car': 0.0, 'cat': 0.0, 'chair': 0.0, 'cow': 0.0, 'diningtable': 0.0, 'dog':...
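For reference, per-class IoU of this kind is usually accumulated from a confusion matrix over all test pixels, so an all-zero table generally means either the predictions never hit any ground-truth class or the matrix is never filled. A minimal, self-contained sketch of that computation (not the repo's eval.py; function names here are illustrative):

```python
import numpy as np

def pixel_confusion(pred, gt, num_classes, ignore_index=255):
    """Accumulate a (num_classes, num_classes) confusion matrix for one image:
    rows are ground-truth classes, columns are predicted classes."""
    valid = gt != ignore_index
    idx = num_classes * gt[valid].astype(np.int64) + pred[valid].astype(np.int64)
    return np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)

def per_class_iou(conf):
    """IoU per class = TP / (TP + FP + FN), read off the confusion matrix."""
    tp = np.diag(conf)
    union = conf.sum(axis=0) + conf.sum(axis=1) - tp
    return tp / np.maximum(union, 1)
```

Printing pred.max() and conf.sum() for a few batches is a quick way to tell whether the model is predicting only background or the evaluation loop never accumulates anything.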
Can you share the cls_labels.npy file? Or where can I find it?
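The [()] in the loading line suggests cls_labels.npy is a pickled dict mapping each image id to a 20-dim multi-hot vector over the VOC classes. If the file really isn't distributed anywhere, one way to rebuild it from the VOC2012 XML annotations is sketched below; the output format is my assumption, so check it against how dataset.py indexes the labels:

```python
import os
import numpy as np
import xml.etree.ElementTree as ET

VOC_CLASSES = ['aeroplane', 'bicycle', 'bird', 'boat', 'bottle', 'bus', 'car',
               'cat', 'chair', 'cow', 'diningtable', 'dog', 'horse', 'motorbike',
               'person', 'pottedplant', 'sheep', 'sofa', 'train', 'tvmonitor']

def build_cls_labels(voc_root, out_path='cls_labels.npy'):
    ann_dir = os.path.join(voc_root, 'Annotations')
    labels = {}
    for fname in sorted(os.listdir(ann_dir)):
        if not fname.endswith('.xml'):
            continue
        vec = np.zeros(len(VOC_CLASSES), dtype=np.float32)
        for obj in ET.parse(os.path.join(ann_dir, fname)).findall('object'):
            name = obj.find('name').text
            if name in VOC_CLASSES:
                vec[VOC_CLASSES.index(name)] = 1.0
        labels[fname[:-4]] = vec   # key is the image id, e.g. '2007_000032'
    # A dict is saved as a 0-d object array, which is why the loader uses [()]
    np.save(out_path, labels)

# build_cls_labels('/path/to/VOCdevkit/VOC2012')   # hypothetical path
```

Note that on newer NumPy versions the corresponding np.load call needs allow_pickle=True.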
I can't run denseExtraction.py successfully, so I revised it, but the .npz files I generated are very large, e.g. X_train_set_512_0006.npz = 3.0 GB and Y_train_set_512_0006.npz = 8.4 GB. Is that correct?
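For what it's worth, the file size mostly depends on how the crops are written out: a common cause of very large Y files is saving one-hot or float label maps instead of integer class ids, and np.savez without compression stores the raw bytes. A small sketch of a more compact save, with hypothetical array names (not the actual denseExtraction.py code):

```python
import numpy as np

def save_chunk(images, labels, out_prefix, idx):
    # Keep compact dtypes: image pixels fit in uint8, Cityscapes label ids fit in uint8 too.
    images = images.astype(np.uint8)
    labels = labels.astype(np.uint8)
    # savez_compressed writes a zipped .npz, usually much smaller than plain savez
    np.savez_compressed('{}_{:04d}.npz'.format(out_prefix, idx), x=images, y=labels)
```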
The command I enter in the terminal is "python train.py -n bdcnn -trp /home/wq/code/semantics_segmentation_of_urban_environments-master/Cityscapes_dataset/leftImg8bit/dense_train_set_512 -vdp /home/wq/code/semantics_segmentation_of_urban_environments-master/Cityscapes_dataset/leftImg8bit/dense_validation_set_512 -tsp /home/wq/code/semantics_segmentation_of_urban_environments-master/Cityscapes_dataset/leftImg8bit/dense_test_set_512 -bs 4 -crf -e 20"
Using TensorFlow backend.
---------------------------------------
Train Set Size -----> 6239027200
----------------------
Validation Set Size -----> 1048576000
---------------------------------------
Test Set Size -----> 9594470400
Traceback (most recent call last):
  File "train.py", line 151,...
I used denseExtraction.py to generate new .npz files, but when I try to run train.py the same error occurs: Traceback (most recent call last): File "train.py", line 172, in...
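Both tracebacks are cut off here, so it's hard to say what actually failed, but a quick sanity check before running train.py is to open one of the generated .npz files and confirm that the array names, shapes, and dtypes match what the loader expects (the very large "Set Size" numbers above also look like byte counts, so memory is worth watching). A small sketch, using one of the file names from the earlier message:

```python
import numpy as np

with np.load('X_train_set_512_0006.npz') as data:
    print('arrays:', data.files)
    for name in data.files:
        arr = data[name]
        print(name, arr.shape, arr.dtype,
              '{:.2f} GB in memory'.format(arr.nbytes / 1e9))
```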