[bug] z.shape and Q.shape do not match in graphLayer
In the code above:
the shape of z is `(self.batch, self.d, self.ht*self.wdth, self.no_of_vert)`,
while the shape of Q is `(self.batch, self.ht*self.wdth, self.d, self.no_of_vert)`.
When execution reaches `z = torch.mul(z, Q)`, an error is reported.
Hi, the shape of Q is (B, HW, V) and that of Z is (B, d, HW, V), so when torch.mul(Z, Q) is used, the output shape will be (B, d, HW, V). You can try it with some random vectors; this should not report an error. Did you face any such error while running the code?
Yes, this error is reported when I run the code:
We can see that the shape of z is [4, 2048, 4096, 2], while Q is [4, 4096, 2].
If I replace the function GraphProject_optim with GraphProject, the error disappears.
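A minimal repro with random tensors of the same sizes (the shapes are taken from the log above, not from the repo code):

```python
import torch

B, d, HW, V = 4, 2048, 4096, 2   # sizes from the error log above

# With batch size 1 the trailing dims happen to line up under broadcasting:
z1 = torch.randn(1, d, HW, V)    # (1, d, HW, V)
Q1 = torch.randn(1, HW, V)       # (1, HW, V)
print(torch.mul(z1, Q1).shape)   # torch.Size([1, 2048, 4096, 2]) -- no error

# With B > 1, broadcasting aligns d (2048) of z against B (4) of Q:
z = torch.randn(B, d, HW, V)
Q = torch.randn(B, HW, V)
z = torch.mul(z, Q)              # RuntimeError: 2048 vs 4 at dimension 1
```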
Another question: I encounter this error:

so I moved conv1 before the upsample.
I wonder why conv1 is put after the upsample?
Regarding torch.mul(z, Q): I realized it will throw an error if the batch size is greater than 1. Thanks for pointing it out. You can reshape Q to (B, 1, HW, V) to get rid of the error. The GraphProject function is just a non-vectorized version of the same thing.
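A sketch of the fix with the shapes from your log (random placeholder tensors, not the repo's code):

```python
import torch

B, d, HW, V = 4, 2048, 4096, 2
z = torch.randn(B, d, HW, V)     # (B, d, HW, V)
Q = torch.randn(B, HW, V)        # (B, HW, V)

# Insert a singleton dim so Q broadcasts over the channel dim d:
Q = Q.view(B, 1, HW, V)          # equivalently Q.unsqueeze(1)
z = torch.mul(z, Q)              # (B, d, HW, V), works for any batch size
print(z.shape)                   # torch.Size([4, 2048, 4096, 2])
```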
The final conv1 is the one that projects the channels to the number of classes. Please go through the paper mentioned in the README. If you remove conv1, it might result in an error or give wrong results. The memory error will go away if you reduce your batch size; you can try batch size = 1. The code needs further optimization to make it more memory efficient.
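For intuition, a rough sketch of the head ordering (a 1x1 conv head is assumed here; the channel count, class count, and upsample factor are placeholders, not the repo's actual values):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

channels, num_classes = 2048, 21                  # placeholder sizes
conv1 = nn.Conv2d(channels, num_classes, kernel_size=1)

x = torch.randn(1, channels, 32, 32)              # backbone feature map
x = F.interpolate(x, scale_factor=8,
                  mode='bilinear', align_corners=False)
logits = conv1(x)                                 # (1, num_classes, 256, 256)
```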
Another small thing: in graphLayer.py, line 193, you need to change Z (capital Z) to z (lowercase z) in torch.div(Z, norm).