
There is an error when running

Open wxde opened this issue 7 years ago • 11 comments

This line of code throws an error when running:

x = tf.reshape(x, [N, G, C // G, H, W])

TypeError: Failed to convert object of type <class 'list'> to Tensor. Contents: [None, 32, 2, None, None]. Consider casting elements to a supported type.

wxde avatar Mar 25 '18 16:03 wxde

Could you provide the command you tried to run and the version of your TensorFlow?

shaohua0116 avatar Mar 25 '18 23:03 shaohua0116

The code is a bit long, so I will just provide the main part:

conv1 = tf.nn.conv2d(x, filter=[3, 3, 1, 64], strides=[1, 1, 1, 1], padding='SAME')
conv1_bn = group_norm(conv1)

My TensorFlow version is 1.4 and my Python version is 3.6. When I run python main.py, the error is as follows:

x = tf.reshape(x, [N, G, C // G, H, W])

TypeError: Failed to convert object of type <class 'list'> to Tensor. Contents: [None, 32, 2, None, None]. Consider casting elements to a supported type.

wxde avatar Mar 26 '18 01:03 wxde

Are you using the norm code I implemented in ops.py or the snippet provided in the paper? I do not think I clearly understand your code, including where you define x and what group_norm is.

shaohua0116 avatar Mar 26 '18 01:03 shaohua0116

I use your norm code; only the function name differs. x is defined as:

x = tf.placeholder(tf.float32, [None, None, None, 1], name='x_input')

wxde avatar Mar 26 '18 02:03 wxde

I cannot debug with this piece of information. My guess is that tf is complaining about the None entries used to specify the tensor shape. Maybe you can try replacing the line x = tf.reshape(x, [N, G, C // G, H, W]) with x = tf.reshape(x, tf.TensorShape([N, G, C // G, H, W])). I can try to help if more information is provided.

shaohua0116 avatar Mar 26 '18 02:03 shaohua0116

Thanks @shaohua0116, I made it work by feeding x a fixed shape. However, the test result was bad when I replaced BN with GN. My batch size is 4.

wxde avatar Mar 26 '18 14:03 wxde

Can you explain what you mean by feeding x a fixed shape? Do you mean before feeding x into GN?

dddddkkk avatar Apr 18 '18 08:04 dddddkkk

I also ran into this problem; it may be a TensorFlow bug. You can use x = tf.reshape(x, [-1, G, C // G, H, W]) instead. It works!

harrylin-hyl avatar May 10 '18 10:05 harrylin-hyl
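The -1 trick can be checked numerically. Below is a NumPy sketch of the group-norm computation (channels-first layout, as in the paper's snippet), where -1 takes the place of the unknown batch size just as it does in tf.reshape; the function name group_norm_np is my own, for illustration only:

```python
import numpy as np

def group_norm_np(x, G=32, eps=1e-5):
    # x: (N, C, H, W), channels-first as in the paper's snippet
    N, C, H, W = x.shape
    G = min(G, C)
    # -1 replaces the unknown batch size N, mirroring tf.reshape(x, [-1, ...])
    xg = x.reshape(-1, G, C // G, H, W)
    # normalize over the channels within each group and the spatial dims
    mean = xg.mean(axis=(2, 3, 4), keepdims=True)
    var = xg.var(axis=(2, 3, 4), keepdims=True)
    xg = (xg - mean) / np.sqrt(var + eps)
    return xg.reshape(-1, C, H, W)

x = np.random.randn(4, 64, 8, 8).astype(np.float32)
y = group_norm_np(x)  # each of the 32 groups now has ~zero mean, ~unit variance
```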

I have a grayscale 3D image dataset: let's say N samples with dimensions H, W, D and C = 1. What would be the proper way of normalizing it using your group norm routine?

soheilesm avatar May 30 '18 22:05 soheilesm

This is the way I did it.

def group_norm(x, G=32, eps=1e-5, scope='group_norm'):
    with tf.variable_scope(scope):
        # x: [N, D, H, W, C] -- 3D volumes, channels-last
        N, D, H, W, C = x.get_shape().as_list()
        G = min(G, C)

        # split the channels into G groups; -1 keeps the batch dim dynamic
        x = tf.reshape(x, [-1, D, H, W, G, C // G])
        # normalize over the spatial dims and the channels within each group
        mean, var = tf.nn.moments(x, [1, 2, 3, 5], keep_dims=True)
        x = (x - mean) / tf.sqrt(var + eps)
        # per-channel affine parameters
        # with tf.variable_scope("group_norm", reuse=tf.AUTO_REUSE):
        gamma = tf.get_variable('gamma', [1, 1, 1, 1, C], initializer=tf.constant_initializer(1.0))
        beta = tf.get_variable('beta', [1, 1, 1, 1, C], initializer=tf.constant_initializer(0.0))
        x = tf.reshape(x, [-1, D, H, W, C]) * gamma + beta

    return x

But if I use group norm multiple times, I get a dimension error for beta and gamma, since C differs across layers. I still cannot solve the issue by setting reuse to False or None, or by running a local variable initializer. I did find a clumsy workaround: creating multiple copies of the group norm function, each with a different name for beta and gamma.

Can you suggest a smarter way? Did you run into this issue of the gamma and beta variables already existing and not being shared?

soheilesm avatar May 31 '18 00:05 soheilesm
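One way around the clash soheilesm describes is to pass a distinct scope per call (the function above already takes a scope argument, e.g. group_norm(x, scope='gn1')), so each layer's gamma and beta get unique names. The toy Python sketch below mimics a name-keyed variable store to show why reusing one scope with different C fails; this get_variable is a hypothetical stand-in for illustration, not the TensorFlow API:

```python
import numpy as np

# Toy stand-in for tf.get_variable's name scoping (illustration only):
# variables live in a store keyed by "<scope>/<name>"; requesting an existing
# name with a different shape fails, which is what happens when group_norm is
# called under the same scope for layers with different channel counts C.
_store = {}

def get_variable(scope, name, shape):
    key = scope + '/' + name
    if key in _store:
        if _store[key].shape != tuple(shape):
            raise ValueError('variable %s already exists with shape %s'
                             % (key, _store[key].shape))
        return _store[key]
    _store[key] = np.ones(shape)
    return _store[key]

# Same scope for two layers with different C -> shape clash (the error above)
get_variable('group_norm', 'gamma', [1, 1, 1, 1, 64])
clashed = False
try:
    get_variable('group_norm', 'gamma', [1, 1, 1, 1, 128])
except ValueError:
    clashed = True

# Fix: a unique scope per layer gives the variables distinct names,
# so each layer can have its own channel count
g1 = get_variable('gn1', 'gamma', [1, 1, 1, 1, 64])
g2 = get_variable('gn2', 'gamma', [1, 1, 1, 1, 128])
```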

> This is the way I did it. [...] Can you suggest a smarter way? did you get into this issue of already existing variables gamma and beta and not sharing them?

I ran into the same error. My current solution is to reshape x into [-1, C, D, H*W] and multiply by a gamma with shape [1, C, 1, 1] (and likewise for beta). Have you found any trick to address this issue?

YongyiTang92 avatar Dec 22 '18 05:12 YongyiTang92
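The reshape YongyiTang92 describes can be sketched in NumPy (assuming channels-first 5D input (N, C, D, H, W); the function name group_norm_cf is my own, for illustration). Collapsing the spatial dims to (N, C, D, H*W) means gamma and beta only need to broadcast along C, so their shapes no longer depend on D, H, W:

```python
import numpy as np

def group_norm_cf(x, gamma, beta, G=32, eps=1e-5):
    # x: (N, C, D, H, W) channels-first 3D data; gamma, beta: (1, C, 1, 1)
    N, C, D, H, W = x.shape
    G = min(G, C)
    # group the channels and flatten H, W into one axis
    xg = x.reshape(-1, G, C // G, D, H * W)
    mean = xg.mean(axis=(2, 3, 4), keepdims=True)
    var = xg.var(axis=(2, 3, 4), keepdims=True)
    xg = (xg - mean) / np.sqrt(var + eps)
    # collapse back to (N, C, D, H*W) so gamma/beta broadcast along C only
    x = xg.reshape(-1, C, D, H * W) * gamma + beta
    return x.reshape(-1, C, D, H, W)

x = np.random.randn(2, 64, 4, 8, 8).astype(np.float32)
gamma = np.ones((1, 64, 1, 1), dtype=np.float32)
beta = np.zeros((1, 64, 1, 1), dtype=np.float32)
y = group_norm_cf(x, gamma, beta)
```

Note that gamma still has C in its shape, so sharing one variable across layers with different channel counts remains impossible; this trick only removes the dependence on the spatial dimensions.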