
Support for other schedulers and early stopping

Open rfriedman22 opened this issue 3 years ago • 2 comments

I was hoping to use a different learning rate scheduler than the default implementation (reduce the learning rate by a factor of 0.8 after a validation loss plateau, with a patience of 16 epochs). Unfortunately, it seems like adding this to the codebase might be rather hairy. My current idea is to specify the scheduler information (everything except the optimizer) in the config file, then pass a _Proxy object with that information to TrainModel. The optimizer can then be bound to the proxy, and a scheduler instance can be created when _init_train is called. Does this seem like an appropriate approach, or is there a better design solution?
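A minimal sketch of the proxy idea described above, assuming the config supplies the scheduler class and its keyword arguments and the optimizer is bound later during training initialization. The names (`SchedulerProxy`, `bind`) are illustrative, not selene's actual API; a stand-in scheduler class keeps the sketch self-contained, whereas in practice the class would be something like `torch.optim.lr_scheduler.ReduceLROnPlateau`.

```python
class SchedulerProxy:
    """Holds a scheduler class and its kwargs until an optimizer exists."""

    def __init__(self, scheduler_class, **kwargs):
        self.scheduler_class = scheduler_class
        self.kwargs = kwargs

    def bind(self, optimizer):
        # Instantiate the real scheduler now that the optimizer is available,
        # e.g. from inside _init_train.
        return self.scheduler_class(optimizer, **self.kwargs)


class FakeScheduler:
    """Stand-in for a torch scheduler so this sketch runs without PyTorch."""

    def __init__(self, optimizer, factor, patience):
        self.optimizer = optimizer
        self.factor = factor
        self.patience = patience


# Built from config values; the optimizer is bound only later.
proxy = SchedulerProxy(FakeScheduler, factor=0.8, patience=16)
scheduler = proxy.bind(optimizer="the real optimizer goes here")
print(scheduler.factor, scheduler.patience)  # 0.8 16
```

The key design point is that the proxy defers construction: the config layer never needs to know about the optimizer, and `TrainModel` only needs to call `bind` once.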

It would also be useful to add support for early stopping. This seems fairly straightforward to me -- all that is needed is to add information to the configs about which metric should be used for early stopping and how much patience to allow. That information can be bound to TrainModel and used within train_and_validate.
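The early-stopping logic described above can be sketched as a small stateful helper: track the best validation metric seen so far and stop once it fails to improve for `patience` consecutive checks. The class and method names here are hypothetical, not part of selene.

```python
class EarlyStopper:
    """Signals when a monitored metric has stopped improving."""

    def __init__(self, patience, mode="min"):
        self.patience = patience  # how many bad checks to tolerate
        self.mode = mode          # "min" for losses, "max" for e.g. AUC
        self.best = None
        self.num_bad_checks = 0

    def should_stop(self, metric):
        improved = (
            self.best is None
            or (self.mode == "min" and metric < self.best)
            or (self.mode == "max" and metric > self.best)
        )
        if improved:
            self.best = metric
            self.num_bad_checks = 0
        else:
            self.num_bad_checks += 1
        return self.num_bad_checks >= self.patience


# Inside train_and_validate, something like:
stopper = EarlyStopper(patience=3, mode="min")
for val_loss in [0.9, 0.8, 0.85, 0.82, 0.81, 0.83]:
    if stopper.should_stop(val_loss):
        print("stopping early")
        break
```

The metric name and patience value would come from the config, matching the approach proposed for the scheduler.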

rfriedman22 avatar Jun 13 '22 22:06 rfriedman22

Hi Ryan,

Sorry for the late response! Yes, I think that's an appropriate way to add it for now -- do you have some working code for this already that you could make a pull request with?

kathyxchen avatar Jul 07 '22 19:07 kathyxchen

Thanks Kathy! I wrote a hack to do this in my own code and it works. I haven't incorporated it into the actual SDK yet -- I was waiting for confirmation that my approach seemed reasonable -- but I will work on this and make a PR soon.

rfriedman22 avatar Jul 08 '22 14:07 rfriedman22