Simon Diener

Results: 2 comments by Simon Diener

I am very sorry, I miscalculated the padding; therefore the result is the same as when using kernel size 4. However, I am still not able to see the...
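
For reference, the way padding and kernel size together determine a convolution's output size can be checked directly. The sketch below is only a minimal illustration of that arithmetic; the specific kernel sizes, input shape, and use of PyTorch are assumptions and not taken from the original issue.

```python
# Minimal sketch: how padding interacts with kernel size in a 2D convolution.
# Kernel sizes and the input shape below are illustrative assumptions only.
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)  # (batch, channels, height, width)

# With stride 1, output size = (H + 2*pad - kernel) + 1.
# Kernel 4 with padding 1 gives 32 + 2 - 4 + 1 = 31 (shrinks by one pixel),
# while kernel 3 with padding 1 keeps the spatial size at 32.
conv_k4 = nn.Conv2d(3, 8, kernel_size=4, padding=1)
conv_k3 = nn.Conv2d(3, 8, kernel_size=3, padding=1)

print(conv_k4(x).shape)  # torch.Size([1, 8, 31, 31])
print(conv_k3(x).shape)  # torch.Size([1, 8, 32, 32])
```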

Hello, yes, I had sadly misunderstood the usage of the attention module in CaraNet. I initially thought that they had implemented the code similarly, but they ended up not using the axial...
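
For context on the module being discussed: axial attention applies self-attention along one spatial axis at a time (height, then width) instead of over all H×W positions at once. The sketch below is a generic, minimal illustration under my own assumptions about shapes and module names; it is not CaraNet's actual implementation.

```python
# Minimal sketch of axial self-attention: attend along the height axis,
# then along the width axis. Module name and sizes are illustrative
# assumptions, not CaraNet's code.
import torch
import torch.nn as nn

class AxialAttention2d(nn.Module):
    def __init__(self, channels, num_heads=4):
        super().__init__()
        self.attn_h = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.attn_w = nn.MultiheadAttention(channels, num_heads, batch_first=True)

    def forward(self, x):
        b, c, h, w = x.shape
        # Attend along the height axis: each column is treated as a sequence.
        xh = x.permute(0, 3, 2, 1).reshape(b * w, h, c)      # (B*W, H, C)
        xh, _ = self.attn_h(xh, xh, xh)
        x = xh.reshape(b, w, h, c).permute(0, 3, 2, 1)       # back to (B, C, H, W)
        # Attend along the width axis: each row is treated as a sequence.
        xw = x.permute(0, 2, 3, 1).reshape(b * h, w, c)      # (B*H, W, C)
        xw, _ = self.attn_w(xw, xw, xw)
        return xw.reshape(b, h, w, c).permute(0, 3, 1, 2)    # (B, C, H, W)

# Example usage with illustrative sizes:
attn = AxialAttention2d(channels=32, num_heads=4)
out = attn(torch.randn(1, 32, 16, 16))
print(out.shape)  # torch.Size([1, 32, 16, 16])
```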