Can you share full training code? Thank you!
Hi,
Thanks for the great work. I tried to run train.py, but "val_encodings.npy" does not seem to be generated by data.py. Can you share the complete data generation/training code? Thank you!
Hi @ericwudocomoi, I added a notebook in the https://github.com/apapiu/transformer_latent_diffusion?tab=readme-ov-file#usage section: the notebook downloads some already preprocessed data, including the val encodings, and does a training run. I will shortly add a notebook for preprocessing an arbitrary dataset as well.
Thank you!
@ericwudocomoi OK, I added another notebook in the https://github.com/apapiu/transformer_latent_diffusion?tab=readme-ov-file#usage section that should help you preprocess the images and text for your own dataset. Let me know if this helps and whether you're able to run it on your own.
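If a rough mental model helps while you look at the notebook: the preprocessing can be thought of as turning images and captions into numpy arrays on disk. Below is a minimal sketch of that idea, assuming an SD-style VAE for image latents and CLIP for text embeddings; the checkpoints, filenames other than val_encodings.npy, and array shapes here are illustrative assumptions, not the exact code in the notebook.

```python
# Minimal sketch (not the exact notebook code): encode images to VAE latents
# and captions to CLIP text embeddings, then save them as .npy files.
# Checkpoints, filenames other than val_encodings.npy, and shapes are
# illustrative assumptions.
import numpy as np
import torch
from PIL import Image
from diffusers import AutoencoderKL
from transformers import CLIPModel, CLIPTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"

# Assumed checkpoints; swap in whatever the notebook actually uses.
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-ema").to(device).eval()
clip = CLIPModel.from_pretrained("openai/clip-vit-large-patch14").to(device).eval()
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

@torch.no_grad()
def encode_image(path, size=256):
    """Load an image, normalize to [-1, 1], and return its VAE latent."""
    img = Image.open(path).convert("RGB").resize((size, size))
    x = torch.from_numpy(np.array(img)).float() / 127.5 - 1.0  # HWC in [-1, 1]
    x = x.permute(2, 0, 1).unsqueeze(0).to(device)             # (1, 3, H, W)
    latent = vae.encode(x).latent_dist.sample().squeeze(0)     # (4, H/8, W/8)
    return latent.cpu().numpy()

@torch.no_grad()
def encode_text(caption):
    """Return the pooled CLIP text embedding for one caption."""
    tokens = tokenizer([caption], padding=True, truncation=True,
                       return_tensors="pt").to(device)
    return clip.get_text_features(**tokens).squeeze(0).cpu().numpy()

# Toy validation split; real paths/captions come from your dataset.
val_items = [("val/cat.jpg", "a photo of a cat")]

img_latents = np.stack([encode_image(p) for p, _ in val_items])
text_embs = np.stack([encode_text(c) for _, c in val_items])

# "val_encodings.npy" is the filename mentioned in the issue; the second
# filename is hypothetical.
np.save("val_encodings.npy", img_latents)
np.save("val_text_embeddings.npy", text_embs)
```

The notebook itself is the source of truth for the exact checkpoints, filenames, and shapes that train.py expects, so treat the above only as an outline.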