Hank Kung
> @mrluin hi, I've bounced back and forth about how best to do this. The authors doing it this way doesn't automatically make it the best approach. I think...
architecture search results: [1 1 2 2 2 3 3 2 1 2 3 3] new cell structure: [[ 0 4] [ 1 4] [ 4 5] [ 2 4]...
> @HankKung Hi, Thanks for your hard work!
>
> > This is the one I've added pre_pre_input for those edge tensors.
>
> Does the way you add pre_pre_input...
Yes, you are right about this.
I believe the code already works this way. The model's optimizer only contains the weight parameters, and the architecture optimizer handles alpha and beta only. Please correct...
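Roughly, the split I mean looks like the sketch below. This is only an illustration with made-up names (`ToySearchModel`, the parameter shapes, the learning rates), not the repo's actual classes:

```python
import torch
import torch.nn as nn

class ToySearchModel(nn.Module):
    """Stand-in for the real search model: ordinary conv weights plus
    architecture parameters named 'alpha' (cell-level) and 'beta' (network-level)."""
    def __init__(self):
        super().__init__()
        self.stem = nn.Conv2d(3, 16, 3, padding=1)
        self.alpha = nn.Parameter(1e-3 * torch.randn(14, 8))
        self.beta = nn.Parameter(1e-3 * torch.randn(12, 4))

    def forward(self, x):
        # The real forward would mix candidate ops with softmax(alpha)/softmax(beta);
        # a plain conv is enough for this parameter-splitting demo.
        return self.stem(x)

model = ToySearchModel()

# Split parameters by name so each optimizer only ever sees its own group.
weight_params = [p for n, p in model.named_parameters() if n not in ('alpha', 'beta')]
arch_params   = [p for n, p in model.named_parameters() if n in ('alpha', 'beta')]

optimizer_w    = torch.optim.SGD(weight_params, lr=0.025, momentum=0.9, weight_decay=3e-4)
optimizer_arch = torch.optim.Adam(arch_params, lr=3e-3, betas=(0.5, 0.999), weight_decay=1e-3)
```

With this split, `optimizer_w.step()` can never touch alpha or beta, and `optimizer_arch.step()` can never touch the network weights.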
> @HankKung Did you end up solving this problem later? Thank you!

I haven't worked on this part yet; I've only been running on a single GPU so far. Does anyone know how to do it?
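For whoever tries it: the quickest generic route in PyTorch is `nn.DataParallel` (or `DistributedDataParallel` for better scaling). I haven't verified this against the repo, so treat it as a sketch only; `model` here is whatever search model you build (e.g. the one in the sketch above):

```python
import torch
import torch.nn as nn

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Replicate the model across all visible GPUs; each batch is split along dim 0.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.to(device)

# Caveat: after wrapping, attributes such as alpha/beta live on model.module,
# so code that reads model.alpha directly has to read model.module.alpha instead.
```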
I think this might be the reason why the DeepLab team searches the architecture on Cityscapes, and according to the paper they don't optimize the architecture parameters during the first 20 epochs...
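In code, that warm-up would look something like the loop below, reusing the two optimizers from the sketch a few comments up; `criterion`, `train_loader`, `val_loader` and the epoch counts are placeholders, and the 20-epoch threshold is just the value the paper mentions:

```python
ARCH_START_EPOCH = 20   # architecture params stay frozen before this epoch
NUM_EPOCHS = 40

for epoch in range(NUM_EPOCHS):
    for (x_train, y_train), (x_val, y_val) in zip(train_loader, val_loader):
        # Always update the network weights on the training split.
        optimizer_w.zero_grad()
        criterion(model(x_train), y_train).backward()
        optimizer_w.step()

        # Only start updating alpha/beta (on the held-out split) after warm-up.
        if epoch >= ARCH_START_EPOCH:
            optimizer_arch.zero_grad()
            criterion(model(x_val), y_val).backward()
            optimizer_arch.step()
```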
> still low mIOU

Isn't the mIOU reported in the paper 35%? Can you share your hyper-parameters, or did you edit the search method? Because the best I can...
I'm afraid we couldn't search for a good architecture because the code isn't a faithful reproduction. There was a syntax error in model_search.py that left only one operation (dil_conv_5x5)...
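For reference, in a DARTS-style search every edge is supposed to be a softmax-weighted mix over the whole candidate set, so a bug that collapses that loop to a single primitive breaks the search entirely. A rough sketch of the intended behaviour, not the repo's actual code (PRIMITIVES is the usual DARTS list, and the `ops_factory` dict mapping each name to a module constructor is assumed):

```python
import torch.nn as nn
import torch.nn.functional as F

# Typical DARTS candidate set; if a bug shrinks this to a single entry,
# every edge degenerates to that one op (e.g. dil_conv_5x5).
PRIMITIVES = ['none', 'max_pool_3x3', 'avg_pool_3x3', 'skip_connect',
              'sep_conv_3x3', 'sep_conv_5x5', 'dil_conv_3x3', 'dil_conv_5x5']

class MixedOp(nn.Module):
    def __init__(self, C, stride, ops_factory):
        super().__init__()
        # One candidate module per primitive, built from the assumed factory dict.
        self._ops = nn.ModuleList(ops_factory[name](C, stride) for name in PRIMITIVES)

    def forward(self, x, alpha_edge):
        # Weighted sum over *all* candidates, not just a single one.
        weights = F.softmax(alpha_edge, dim=-1)
        return sum(w * op(x) for w, op in zip(weights, self._ops))
```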