Scaling in all layers
I want to know how you implement scaling in all the layers in PyTorch, which is originally done in each layer of the Caffe model.
@JK009 if the scaling you mention is of the form y = k*x, it can easily be implemented in PyTorch: simply writing y = k * x does the job, where x and y are tensors and k is a scalar.
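A minimal sketch of that elementwise scaling in PyTorch (the tensor shape and the value of k are arbitrary, chosen only for illustration):

```python
import torch

# Elementwise scaling y = k * x: k is a Python scalar,
# broadcast over every element of the tensor x.
x = torch.randn(2, 3)
k = 0.5
y = k * x

# The result has the same shape as x, with every entry halved.
print(y.shape)  # torch.Size([2, 3])
```

No dedicated layer is needed for this; the multiplication is recorded by autograd like any other tensor op.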
To rephrase my question: have you implemented scaling in the PyTorch code of SqueezeNext?
Is transform.resize equivalent to the scale layer, which is meant for a smooth transition from the input to the convolution? Did you implement this in your SqueezeNext code or not?
output = F.relu(self.bn1(self.conv1(input)))
I don't see any scaling layer in your code. Does this make it different from the Caffe-based SqueezeNext architecture or not? Please reply ASAP.
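One likely explanation, sketched below (this is not the SqueezeNext authors' code, just an illustration of the usual Caffe-to-PyTorch mapping): Caffe's `BatchNorm` layer only normalizes and is typically followed by a separate `Scale` layer that holds the learnable per-channel gamma/beta, whereas PyTorch's `nn.BatchNorm2d` with `affine=True` (the default) bundles both steps into one module, so a line like `self.bn1(self.conv1(input))` already includes the scale.

```python
import torch
import torch.nn as nn

# nn.BatchNorm2d(affine=True) holds a learnable per-channel
# weight (Caffe Scale layer's gamma) and bias (its beta),
# so normalization and scaling happen in a single module.
bn = nn.BatchNorm2d(16, affine=True)
print(bn.weight.shape, bn.bias.shape)  # one gamma and one beta per channel

x = torch.randn(1, 16, 8, 8)
y = bn(x)  # normalize + scale + shift in one call
```

If that mapping holds for this repo, the absence of an explicit scale layer would not make the architecture differ from the Caffe version; the scaling would simply live inside the batch-norm modules.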