
Add learning rate scheduling feature

ykskks opened this issue 5 years ago · 0 comments

Hello!

Idea for new feature

Currently, only a fixed learning rate is supported during model training. However, learning rate scheduling has been shown to help deep learning models escape local minima and reach better final performance. I think the same applies to matrix factorization models; in fact, final model performance improved in a local experiment I ran with BPR.

Idea for implementation

Currently, I add a Scheduler class in utils.py and pass a Scheduler instance to BayesianPersonalizedRanking.__init__(). The Scheduler tracks the best score achieved so far, and if model performance does not improve for patience iterations, it updates self.learning_rate. We need some kind of evaluation metric to assess model performance; I currently add a get_metric_on_val method to BayesianPersonalizedRanking and call it every iteration, but there may be other ways to do this.
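
In case it helps the discussion, here is a minimal sketch of what such a Scheduler could look like, assuming a reduce-on-plateau strategy. The patience, factor, and min_lr parameters and the step method are illustrative names I chose, not existing implicit API, and get_metric_on_val refers to the proposed validation helper above, which does not exist yet.

```python
# Hypothetical sketch of the proposed Scheduler (not part of implicit's current API).
class Scheduler:
    """Reduce the learning rate when a validation metric stops improving."""

    def __init__(self, patience=3, factor=0.5, min_lr=1e-5):
        self.patience = patience      # iterations to wait before reducing
        self.factor = factor          # multiplicative decay applied to the learning rate
        self.min_lr = min_lr          # lower bound for the learning rate
        self.best_score = float("-inf")
        self.num_bad_iters = 0

    def step(self, score, learning_rate):
        """Return the (possibly reduced) learning rate for the next iteration."""
        if score > self.best_score:
            self.best_score = score
            self.num_bad_iters = 0
        else:
            self.num_bad_iters += 1
            if self.num_bad_iters >= self.patience:
                learning_rate = max(learning_rate * self.factor, self.min_lr)
                self.num_bad_iters = 0
        return learning_rate


# Illustrative wiring inside the training loop; get_metric_on_val is the
# proposed validation-metric helper, not an existing method.
# scheduler = Scheduler(patience=3, factor=0.5)
# for epoch in range(epochs):
#     ...  # run one training epoch
#     score = model.get_metric_on_val(val_data)
#     model.learning_rate = scheduler.step(score, model.learning_rate)
```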

If this feature is useful, I would like to work on this.

Thank you!

ykskks · Nov 18 '20 08:11