Jasper - fine-tuning a pretrained model with a different language/label list
When I add some additional labels to the jasper10x5dr_speedp-online_speca.yaml file and run train.py, I get the following error:
size mismatch for decoder.layers.0.weight: copying a param with shape torch.Size([29, 1024, 1]) from checkpoint, the shape in current model is torch.Size([32, 1024, 1]).
size mismatch for decoder.layers.0.bias: copying a param with shape torch.Size([29]) from checkpoint, the shape in current model is torch.Size([32]).
Is there any way to fine-tune from the English ASR model with a different dataset/different label list? I know there was a --warm_start option in the Tacotron2 text-to-speech model for this specific problem; is there a similar parameter for Jasper?
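In plain PyTorch, a common workaround for this kind of shape mismatch is to load the checkpoint's state dict manually, drop the entries whose shapes don't match the new model (here, the decoder's output layer, since the label count changed from 29 to 32), and then call load_state_dict with strict=False so the remaining decoder weights stay freshly initialized for fine-tuning. The sketch below uses toy stand-in modules (ToyJasper, the 64-channel input size, and the kernel sizes are all hypothetical, chosen only to reproduce the mismatch pattern from the error message); the exact checkpoint key names in the real Jasper training script may differ.

```python
import torch
import torch.nn as nn

# Toy stand-in for the real network: only the shapes matter here.
# The encoder is shared across label sets; the decoder's output
# dimension equals the number of labels (29 in the English
# checkpoint vs. 32 after adding labels).
class ToyJasper(nn.Module):
    def __init__(self, num_labels):
        super().__init__()
        self.encoder = nn.Conv1d(64, 1024, kernel_size=11, padding=5)
        self.decoder = nn.Conv1d(1024, num_labels, kernel_size=1)

pretrained = ToyJasper(29)   # plays the role of the English checkpoint
model = ToyJasper(32)        # current model with the extended label list

ckpt_sd = pretrained.state_dict()
model_sd = model.state_dict()

# Keep only parameters whose name AND shape match the current model;
# the mismatched decoder.* tensors are dropped and remain randomly
# initialized, ready to be learned during fine-tuning.
filtered = {k: v for k, v in ckpt_sd.items()
            if k in model_sd and v.shape == model_sd[k].shape}
dropped = sorted(set(model_sd) - set(filtered))

# strict=False tolerates the missing decoder keys.
model.load_state_dict(filtered, strict=False)
print("dropped (left at random init):", dropped)
```

After this, the encoder starts from the pretrained weights while the decoder is trained from scratch on the new label set. With a real checkpoint file you would replace `pretrained.state_dict()` with the dict loaded via `torch.load(...)` (possibly unwrapping a `"state_dict"` key, depending on how the checkpoint was saved).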
Best regards jereb321
Did you find any solution? I'm having the same issue/question.