Yao-Hung Hubert Tsai

10 comments

Can you try this alternative codebase: https://github.com/yaohungt/Capsules-Inverted-Attention-Routing It uses less memory and runs faster at inference.

Hi, can you be more specific? If your input is larger, you may need a larger network to fit the training data.

I haven't seen your code, but my guess is that it's because of your input size: 84x84x1, while CIFAR10 is 32x32x3. You can modify the config file in ./configs so that the...
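
For illustration only, a minimal sketch of the kind of change meant here, written as a Python dict; the key names (input_size, num_channels) are assumptions, not necessarily the actual schema of the files in ./configs:

    # Hypothetical config fragment (key names are assumptions, not the
    # repo's actual schema) showing the dimensions that would need to
    # change for 84x84x1 input.
    config = {
        "input_size": 84,   # CIFAR10 default would be 32
        "num_channels": 1,  # CIFAR10 default would be 3
    }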

I'm not sure. I think you can print 1) the shape of the default CIFAR10 data; and 2) the shape of your own data. They should look alike.
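
A minimal sketch of that check, assuming torchvision is installed; MyDataset is a placeholder for your own dataset class:

    from torchvision import datasets, transforms

    # 1) Shape of a default CIFAR10 sample.
    cifar = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=transforms.ToTensor())
    x, _ = cifar[0]
    print("CIFAR10 sample shape:", x.shape)  # torch.Size([3, 32, 32])

    # 2) Shape of one of your own samples (MyDataset is hypothetical).
    # mine = MyDataset(...)
    # x_mine, _ = mine[0]
    # print("My sample shape:", x_mine.shape)  # should match [C, H, W]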

You can modify the config based on Tables 10-12 in the paper.

It seems syntactically correct.

I have no access to machines right now, but your config file is correct except that "True" should be "true". We also conducted a variant of your config, which is...
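
For reference, JSON booleans are lowercase: a capitalized bare token fails to parse, and the quoted string "True" parses as a string rather than a boolean. A quick check (use_foo is a hypothetical key name):

    import json

    print(json.loads('{"use_foo": true}'))    # OK: {'use_foo': True}
    # json.loads('{"use_foo": True}')         # raises json.JSONDecodeError
    print(json.loads('{"use_foo": "True"}'))  # parses, but the value is the
                                              # string "True", not a boolean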

I'm not sure whether your training/test setup matches the setting in the paper. You can first try the vanilla Capsules (Dynamic/EM Capsules) on MultiMNIST and see what you get....

Doing this can save a lot of parameters. You can try going directly from 800 to 100, but that requires many more parameters and much more memory.
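
To see why the direct mapping is heavy, a back-of-the-envelope count, assuming a fully connected capsule transform whose vote weights scale with in_capsules x out_capsules x pose size (a sketch; the repo's exact layer may differ, and the pose size of 16 is an assumption):

    in_caps, out_caps = 800, 100
    pose = 16  # assumed pose dimension, e.g. a 4x4 pose matrix
    votes = in_caps * out_caps * pose
    print(votes)  # 1,280,000 weights just for the direct 800 -> 100 transform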

I'm not aware of any similar work that combines both.