taming-transformers
lr scheduler
Hi there. I ran into the following warning when training a transformer model on FFHQ and CelebA-HQ: "You are using LearningRateMonitor callback with models that have no learning rate schedulers. Please see documentation for configure_optimizers method." Could you please tell me what kind of lr scheduler you used and, if possible, other hyperparameters such as learning rate and batch size during training? Thanks!
Hi, have you solved it?
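For anyone else hitting this: the warning comes from PyTorch Lightning, whose LearningRateMonitor callback has nothing to log unless configure_optimizers also returns a scheduler. Below is a minimal sketch of attaching one; the CosineAnnealingLR choice and the learning rate value are illustrative assumptions, not the settings the taming-transformers authors actually used.

```python
import torch
import pytorch_lightning as pl
from pytorch_lightning.callbacks import LearningRateMonitor


class LitTransformer(pl.LightningModule):
    """Toy stand-in for the transformer model; only the optimizer wiring matters here."""

    def __init__(self, learning_rate=4.5e-6):
        super().__init__()
        self.learning_rate = learning_rate  # hypothetical value, for illustration only
        self.layer = torch.nn.Linear(8, 8)

    def training_step(self, batch, batch_idx):
        return self.layer(batch).mean()

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=self.learning_rate)
        # Hypothetical scheduler choice: any torch.optim.lr_scheduler works here.
        scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)
        # Returning a scheduler alongside the optimizer gives
        # LearningRateMonitor something to log, which removes the warning.
        return [optimizer], [scheduler]


# With a scheduler in place, the callback logs the lr as expected:
# trainer = pl.Trainer(callbacks=[LearningRateMonitor(logging_interval="step")])
```

Note that if the original training setup really did use a constant learning rate with no scheduler, the warning is harmless and can simply be ignored.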