gdjmck


I read the TridentNet paper briefly and wondered whether the implementation should be using the same conv weights for the different dilated conv kernels.
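For context, TridentNet's trident branches differ only in their dilation rate while sharing the same convolution weights. Below is a minimal 1D NumPy sketch of that idea, not the repo's actual code: `dilated_conv1d` is a hypothetical helper, and one shared weight vector `w` is applied at two dilation rates.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Valid 1D cross-correlation of x with kernel w at the given dilation.
    The effective receptive field spans (k - 1) * dilation + 1 inputs."""
    k = len(w)
    span = (k - 1) * dilation + 1
    return np.array([np.dot(x[i:i + span:dilation], w)
                     for i in range(len(x) - span + 1)])

w = np.array([1.0, 2.0, 3.0])       # one shared weight vector
x = np.arange(10, dtype=float)
y1 = dilated_conv1d(x, w, dilation=1)  # ordinary conv
y2 = dilated_conv1d(x, w, dilation=2)  # same weights, larger receptive field
```

Both outputs come from the identical weights `w`; only the sampling stride inside the kernel window changes, which is the weight-sharing scheme the question is about.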

The depth of the feature maps, i.e. the depth of ```Mixed_6e``` from Inception_v3, is 768, and by default 32 attention maps are generated; then after the BAP module, the width and height...
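To make the shapes concrete, here is a hedged NumPy sketch of Bilinear Attention Pooling as commonly described for WS-DAN, not the repo's exact code: each of the 32 attention maps spatially pools the 768-channel feature maps, so the width and height dimensions collapse and a 32x768 part-feature matrix remains. The 17x17 spatial size assumes a 299x299 input to Inception_v3.

```python
import numpy as np

def bilinear_attention_pool(features, attention):
    """features: (H, W, C) feature maps; attention: (H, W, M) attention maps.
    Returns an (M, C) matrix: attention map m pools feature channel c over
    all spatial positions, averaged by the number of positions."""
    H, W, _ = features.shape
    return np.einsum('hwm,hwc->mc', attention, features) / (H * W)

rng = np.random.default_rng(0)
F = rng.standard_normal((17, 17, 768))  # Mixed_6e depth is 768
A = rng.standard_normal((17, 17, 32))   # 32 attention maps by default
P = bilinear_attention_pool(F, A)
print(P.shape)  # (32, 768)
```

After flattening, `P` gives the 32 * 768 = 24576-dimensional vector that the classifier head would consume in this sketch.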