What could cause the following problem? Missing key(s) in state_dict: "quantizer.vq.layers.12._codebook.inited", "quantizer.vq.layers.12._codebook.cluster_size", "quantizer.vq.layers.12._codebook.embed", "quantizer.vq.layers.12._codebook.embed_avg", "quantizer.vq.layers.13._codebook.inited", "quantizer.vq.layers.13._codebook.cluster_size", "quantizer.vq.layers.13._codebook.embed", "quantizer.vq.layers.13._codebook.embed_avg", "quantizer.vq.layers.14._codebook.inited", "quantizer.vq.layers.14._codebook.cluster_size", "quantizer.vq.layers.14._codebook.embed", "quantizer.vq.layers.14._codebook.embed_avg", "quantizer.vq.layers.15._codebook.inited", "quantizer.vq.layers.15._codebook.cluster_size", "quantizer.vq.layers.15._codebook.embed", "quantizer.vq.layers.15._codebook.embed_avg", "quantizer.vq.layers.16._codebook.inited", "quantizer.vq.layers.16._codebook.cluster_size",...
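For context, this error pattern typically means the instantiated model has more residual-VQ layers (e.g. a larger `n_q`) than the checkpoint was trained with, so `load_state_dict` cannot find weights for the extra quantizer layers. A minimal toy sketch (generic `nn.Linear` layers, not the actual Encodec classes) reproducing and handling the mismatch:

```python
import torch.nn as nn

# Toy illustration: a model built with 16 sublayers loading a checkpoint
# saved from a model with only 12 reproduces "Missing key(s) in state_dict".
small = nn.ModuleList([nn.Linear(4, 4) for _ in range(12)])
large = nn.ModuleList([nn.Linear(4, 4) for _ in range(16)])

ckpt = small.state_dict()  # checkpoint saved with 12 layers

try:
    large.load_state_dict(ckpt)  # strict=True (default) raises RuntimeError
except RuntimeError as e:
    print("strict load failed:", str(e).splitlines()[0])

# Workaround: tolerate the mismatch; layers 12-15 keep their random init.
result = large.load_state_dict(ckpt, strict=False)
print("missing keys:", len(result.missing_keys))  # 4 layers x (weight, bias) = 8

# Usually the correct fix is to construct the model with the same number of
# quantizer layers the checkpoint was trained with, rather than strict=False.
```

In practice, check that the `n_q` / target-bandwidth setting used at inference matches the one used for training before falling back to `strict=False`.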
Hi yangdongchao! When I train the Encodec 24k_240 model at 1 kbps, the loss is very high and oscillates significantly during the early stages. Is this a normal phenomenon? ### The...
I noticed that this Encodec project does not use a language model (LM) over the codebook indices, unlike Facebook's Encodec. Have you made any attempts to add one?
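For background, Facebook's EnCodec trains a small transformer LM over the quantized code indices and uses its predicted probabilities for entropy coding. A minimal sketch of such a token LM (all sizes and names here are illustrative assumptions, not the real EnCodec architecture or config):

```python
import torch
import torch.nn as nn

class CodeLM(nn.Module):
    """Hypothetical LM over residual-VQ code indices, in the spirit of
    EnCodec's entropy-coding LM. Predicts per-quantizer logits per frame."""
    def __init__(self, codebook_size=1024, n_q=8, dim=128):
        super().__init__()
        # one embedding table per quantizer level; embeddings are summed
        self.embed = nn.ModuleList(
            [nn.Embedding(codebook_size, dim) for _ in range(n_q)])
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        # one output head per quantizer level
        self.heads = nn.ModuleList(
            [nn.Linear(dim, codebook_size) for _ in range(n_q)])

    def forward(self, codes):  # codes: (B, n_q, T) integer indices
        x = sum(emb(codes[:, q]) for q, emb in enumerate(self.embed))
        # causal mask: each frame attends only to past frames
        T = codes.shape[-1]
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h = self.backbone(x, mask=mask)
        # (B, n_q, T, codebook_size) logits -> probabilities for an
        # arithmetic coder at compression time
        return torch.stack([head(h) for head in self.heads], dim=1)

lm = CodeLM()
codes = torch.randint(0, 1024, (2, 8, 50))
logits = lm(codes)
print(logits.shape)  # torch.Size([2, 8, 50, 1024])
```

The LM does not change reconstruction quality; it only tightens the bitrate by replacing fixed-length code transmission with entropy coding under the predicted distribution.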
How can I resolve this problem?