How do I train the VQVAE?
Could you release the code for training VQVAE?
@Longhzzz please refer to https://github.com/FoundationVision/vaex
I downloaded the code from that link and found that it needs some local weight files to run. In utils/arg_utils.py, lines 270-272, could you make the weight files C:\Users\16333\Desktop\PyCharm\vgpt_vae\lpips_with_vgg.pth and C:\Users\16333\Desktop\PyCharm\vgpt_vae\vit_small_patch16_224.pth publicly available, along with any other weight files needed to run the VAE training code?
@vxlot please refer to:
- vit_small_patch16_224.pth: https://huggingface.co/timm/vit_small_patch16_224.dino
- lpips_with_vgg.pth: https://huggingface.co/spaces/multimodalart/vqgan/blob/dec38285640c45fc3f8377a9726daf6e0de08d6a/taming/modules/autoencoder/lpips/vgg.pth
Hello, in the code from the VAE repository there is the line dd.heads.load_state_dict(heads.state_dict()). Are the weights of heads initialized by default, or do they need to be loaded from weight files that are currently not provided? Meanwhile, the results I generated are very different from those shown: 50.266475677, 91.698143005. Could you help answer this question?
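For what it's worth, a minimal sketch of what that line does in isolation (the module names here are stand-ins, not the repository's actual classes): load_state_dict simply copies whatever parameters the source module currently holds, so if heads was never loaded from a checkpoint, dd.heads just receives its default random initialization.

```python
import torch
import torch.nn as nn

# Stand-in modules with identical architectures (hypothetical shapes).
heads = nn.Linear(4, 2)      # default (random) initialization
dd_heads = nn.Linear(4, 2)   # a second copy, also randomly initialized

# Equivalent to dd.heads.load_state_dict(heads.state_dict()):
# copies heads' current parameters into dd_heads, whatever they are.
dd_heads.load_state_dict(heads.state_dict())

# After the copy, both modules hold identical weights.
print(torch.equal(dd_heads.weight, heads.weight))  # True
```

So the line alone does not imply a missing checkpoint; whether heads carries pretrained weights depends on what was loaded into it beforehand.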