こしあん

Results: 35 comments of こしあん

@gogo03 The number of layers was different. Thank you. I had been wondering why WRN performed so poorly, since it should not be that bad even without OctConv. Here is the WRN 28-10 implemented by [the auto...

@gogo03 Hi, I fixed the bug where the number of layers in the Wide ResNet was wrong. It has been committed to the repository. The learning curve and prediction time are...

@gogo03 Hi. Thank you for sharing the result. Cutout is a data-augmentation technique, while OctConv is a network structure, so the two cannot be compared simply by accuracy. We can improve the accuracy...

@a5372935 (My explanation will be long, but please bear with me.) One of the reasons GPU usage goes down as alpha increases is that the theoretical FLOPs is...
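For reference, the OctConv paper gives the theoretical FLOPs of an octave convolution (with alpha_in = alpha_out = alpha) relative to a vanilla convolution: the low-frequency branch lives at half spatial resolution, so three of the four paths cost a quarter as much. A small sketch of that formula (function name is mine, not from the repository):

```python
def octconv_flops_ratio(alpha):
    """FLOPs of OctConv relative to a vanilla conv with the same channels."""
    hh = (1 - alpha) ** 2            # high -> high, computed at full resolution
    hl = 0.25 * (1 - alpha) * alpha  # high -> low, computed at half resolution
    lh = 0.25 * alpha * (1 - alpha)  # low -> high, computed at half resolution
    ll = 0.25 * alpha ** 2           # low -> low, computed at half resolution
    return hh + hl + lh + ll         # simplifies to 1 - 0.75 * alpha * (2 - alpha)

print(octconv_flops_ratio(0.0))  # 1.0  (alpha = 0 is just a vanilla conv)
print(octconv_flops_ratio(1.0))  # 0.25 (everything at half resolution)
```

So larger alpha means fewer theoretical FLOPs, which is one reason measured GPU utilization can drop even if wall-clock time does not improve proportionally.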

@supermark1995 (1) A Conv layer is required twice in each module because of the ResNet structure. (2) I don't remember the detailed loss values. Isn't the val loss lowered by running it according to this...

@DERACCO May I ask a question: is the +0.5 in this code meant to guard against floating-point rounding errors?

```python
skip_high = layers.Conv2D(int(ch*(1-alpha)+0.5), 1)(high)
```
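For context on what the +0.5 changes: Python's `int()` truncates toward zero, so a product like `ch * (1 - alpha)` that lands just below a whole number due to floating-point error loses a channel, while adding 0.5 turns the truncation into round-to-nearest. A standalone illustration (the `ch` and `alpha` values here are just examples, not from the repository):

```python
ch, alpha = 320, 0.9
exact = ch * (1 - alpha)   # mathematically 32, but floats give 31.999999999999993
print(int(exact))          # 31 -- plain truncation undercounts by one channel
print(int(exact + 0.5))    # 32 -- the +0.5 makes int() round to nearest
```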

@a5372935 Hi. This code's OctConv assumes that the input resolution is even. If it is odd, such as 75, adjust it using ZeroPadding or similar.
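To illustrate why even resolutions matter: the low-frequency branch runs at half resolution, and with an odd input like 75 the downsample/upsample round trip no longer matches the high-frequency branch's shape. A minimal NumPy sketch of the fix (not the repository's Keras code; `pad_to_even` is a hypothetical helper):

```python
import numpy as np

def pad_to_even(x):
    """Zero-pad the bottom/right so both spatial dims are even."""
    h, w = x.shape
    return np.pad(x, ((0, h % 2), (0, w % 2)))

x = pad_to_even(np.ones((75, 75)))               # 75x75 -> 76x76
low = x.reshape(38, 2, 38, 2).mean(axis=(1, 3))  # 2x average pool -> 38x38
up = low.repeat(2, axis=0).repeat(2, axis=1)     # nearest upsample -> 76x76
assert up.shape == x.shape                       # the two branches can now be summed
```

In Keras the same adjustment would be a `ZeroPadding2D` layer before the first OctConv block.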

@jiafengshen I do not understand Chinese well, but I think you are asking about the structure of ResNet with OctConv, so I will explain. First, let's look back at the operations...

@AakashKumarNain In the figure above, **ResBlock with OctConv** is the general case where alpha is between 0 and 1. If alpha is 0 or 1, it will be equal to the...

@AakashKumarNain Sorry for the late reply. This project is under the MIT license, so you can use it freely within its terms.