ghoshaw
Thanks, I solved this problem by adding 'optimizer = None' to .theanorc. But I have run into a new problem. I don't know if it is because I use theano...
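For reference, a minimal sketch of that workaround, assuming you set it from code via the THEANO_FLAGS environment variable instead of editing .theanorc directly (the two are equivalent):

```python
import os

# Disable Theano's graph optimizer before theano is imported.
# Equivalent to putting the following in ~/.theanorc:
#   [global]
#   optimizer = None
os.environ["THEANO_FLAGS"] = "optimizer=None"

import theano
print(theano.config.optimizer)  # expected to print 'None'
```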
Sorry, I used Python 2. I will try this again with Python 3.
Thanks for your patience. I am a little confused about the file names of the training samples; could you give me some examples? Thanks very much.
Thanks very much, this helps a lot!
@alchemz, sorry, I do not know where your .theanorc file is, but you can try creating one in your home directory (~/.theanorc) if you cannot find it.
If I add an implementation of training_epoch_end, don't I need to gather the losses from all GPUs inside that function? And for validation_epoch_end, don't the results from all GPUs need to be gathered as well?
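For anyone else wondering, here is a minimal sketch of gathering per-GPU results, assuming a PyTorch Lightning version that still has the validation_epoch_end hook and supports self.all_gather; LitModel and _compute_loss are hypothetical names, not code from this repo:

```python
import torch
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    # ... model definition, training_step, optimizers omitted ...

    def validation_step(self, batch, batch_idx):
        # _compute_loss is a hypothetical helper returning a scalar tensor
        return self._compute_loss(batch)

    def validation_epoch_end(self, outputs):
        # Stack this rank's per-step losses, then gather the stacked
        # tensor from every GPU (shape: world_size x num_steps).
        losses = torch.stack(outputs)
        gathered = self.all_gather(losses)
        self.log("val_loss", gathered.mean())
```

The same pattern applies to training_epoch_end if you want an epoch-level metric computed over all cards rather than only the local rank.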
@YunYang1994, thanks for your answer. And how can I train yolov3 with multiple GPUs?
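In case it helps, a minimal multi-GPU sketch using tf.distribute.MirroredStrategy; build_model, yolo_loss, and build_dataset are hypothetical stand-ins for the repo's own model, loss, and input pipeline, and the repo's actual training script may need a different approach (e.g. a custom training loop):

```python
import tensorflow as tf

# Replicate the model on every visible GPU and average gradients.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    model = build_model()                              # hypothetical: build the YOLOv3 network
    model.compile(optimizer="adam", loss=yolo_loss)    # hypothetical loss function

# Scale the global batch size with the number of replicas.
dataset = build_dataset(batch_size=8 * strategy.num_replicas_in_sync)  # hypothetical
model.fit(dataset, epochs=30)
```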
I ran into the same situation. Did you ever find the cause?
Same question: can sparsity training reach the accuracy of the original model, or does it have to match the original model's accuracy before pruning can start? After sparsity training my model dropped 5 points relative to the original; after pruning and fine-tuning it never gets back to the original accuracy no matter what I try, it is only slightly better than the sparsity-trained model.
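For context, a minimal sketch of the usual "network slimming" style sparsity step (an L1 penalty on every BatchNorm scale), assuming a PyTorch model; model and sparsity_lambda are placeholders, not this repo's exact code:

```python
import torch

sparsity_lambda = 1e-4  # assumed penalty strength; too large a value can cost accuracy

def apply_bn_l1_penalty(model):
    """Add the subgradient of lambda * |gamma| to each BatchNorm scale's grad.

    Assumes loss.backward() has already been called, so .grad is populated.
    """
    for m in model.modules():
        if isinstance(m, torch.nn.BatchNorm2d):
            m.weight.grad.add_(sparsity_lambda * torch.sign(m.weight.data))
```

Call apply_bn_l1_penalty(model) between loss.backward() and optimizer.step(); lowering sparsity_lambda is one common way to reduce the accuracy drop during sparsity training, at the cost of fewer prunable channels.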
The line "cur_gt_corners = tf.transpose(tf.tile(tf.Variable([[gx0, gy0, gx1, gy1, gx2, gy2, gx3, gy3, gx4, gy4, gx5, gy5, gx6, gy6, gx7, gy7, gx8, gy8]], trainable=False), (nAnchors, 1)), (1, 0)) " in 316...