
Adding the LBFGS optimizer from PyTorch

Open e-eight opened this issue 4 years ago • 12 comments

Hi,

I am trying to use the BaggingRegressor model with shallow estimators on a small dataset, for which the LBFGS optimizer usually gives good results with a single estimator. However, I see that PyTorch's LBFGS optimizer is not included in the accepted list of optimizers for torchensemble. Would it be possible to add the LBFGS optimizer to the accepted list, or is there any other way I can use the LBFGS optimizer with torchensemble for my work?

Thanks

e-eight avatar Jun 09 '21 20:06 e-eight

Hi @e-eight, thanks for reporting! Could you provide me with an example of how to use the LBFGS optimizer, for example:

for batch_idx, (data, target) in enumerate(dataloader):
    # Here is your code

According to this introduction, it looks like using the LBFGS optimizer is different from other optimizers.

xuyxu avatar Jun 10 '21 00:06 xuyxu

You can try it this way:

for batch_idx, (data, target) in enumerate(dataloader):

    # Code for sampling with replacement for bagging, or the corresponding code
    # for other models.

    # Optimization
    def closure():
        if torch.is_grad_enabled():
            optimizer.zero_grad()
        sampling_output = estimator(*sampling_data)
        loss = criterion(sampling_output, sampling_target)
        if loss.requires_grad:
            loss.backward()
        return loss

    optimizer.step(closure)

    # If you want the loss for monitoring, re-evaluate the closure
    # (optimizer.step(closure) also returns the loss):
    loss = closure()

This way of optimizing should work with both LBFGS and other optimizers such as Adam; at least it has worked for me with single estimators. You might find more details on the LBFGS optimizer here.
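The closure pattern above can be sketched end-to-end on a toy problem (the model, data, and learning rate here are illustrative, not part of torchensemble):

```python
import torch
import torch.nn as nn

# Toy setup: fit y = 2x + 1 with a single linear layer.
torch.manual_seed(0)
data = torch.randn(64, 1)
target = 2 * data + 1

estimator = nn.Linear(1, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.LBFGS(estimator.parameters(), lr=0.1)

initial_loss = criterion(estimator(data), target).item()

for _ in range(5):
    def closure():
        # LBFGS may call the closure several times per step, so it
        # must re-evaluate the model and recompute gradients each call.
        if torch.is_grad_enabled():
            optimizer.zero_grad()
        output = estimator(data)
        loss = criterion(output, target)
        if loss.requires_grad:
            loss.backward()
        return loss

    # step(closure) returns the loss from the closure evaluation,
    # so no separate forward pass is needed for monitoring.
    loss = optimizer.step(closure)

print(loss.item() < initial_loss)
```

Because `step()` returns the closure's loss, the same loop body also works for first-order optimizers like Adam, which simply call the closure once.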

e-eight avatar Jun 10 '21 02:06 e-eight

Thanks for your explanation. After reading the introduction, I think there should be no problem supporting the LBFGS optimizer. Would you be interested in working on this feature request? ;-)

xuyxu avatar Jun 10 '21 11:06 xuyxu

Sure, I will be happy to work on it! I will get started on it then, and comment here if I run into any problems.

e-eight avatar Jun 10 '21 11:06 e-eight

Glad to hear that 😄. Here are some instructions on what to do next:

  • Add your contribution in CHANGELOG.rst
  • Add the LBFGS optimizer to the set_optimizer method in torchensemble/utils/set_module.py
  • Update the docstring __set_optimizer_doc in torchensemble/_constants.py
  • Try modifying the training loop in torchensemble/fusion.py to see if the LBFGS optimizer and the other optimizers both work as expected. After that, we can modify the other ensembles similarly.
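For the second step, the change might look something like the following minimal sketch; the actual signature and accepted-optimizer list in torchensemble/utils/set_module.py may differ, so treat the names here as illustrative:

```python
import torch

def set_optimizer(estimator, optimizer_name, **kwargs):
    """Hypothetical sketch of the dispatch in set_module.py:
    map the optimizer name to the corresponding torch.optim class."""
    torch_optim = {
        "Adam": torch.optim.Adam,
        "SGD": torch.optim.SGD,
        "LBFGS": torch.optim.LBFGS,  # newly accepted optimizer
    }
    if optimizer_name not in torch_optim:
        raise NotImplementedError(
            f"Optimizer {optimizer_name} is not supported."
        )
    return torch_optim[optimizer_name](estimator.parameters(), **kwargs)
```

The training loop would then only need the closure-based `optimizer.step(closure)` call, which is compatible with both LBFGS and the first-order optimizers.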

Feel free to ask me anything in this issue or your pull request.

xuyxu avatar Jun 10 '21 12:06 xuyxu

@all-contributors please add @e-eight for code

xuyxu avatar Jun 10 '21 12:06 xuyxu

@xuyxu

I've put up a pull request to add @e-eight! :tada:

allcontributors[bot] avatar Jun 10 '21 12:06 allcontributors[bot]

@e-eight it would be better to open a PR on your own.

xuyxu avatar Jun 10 '21 23:06 xuyxu

I have written the code, but I am not sure of the best way to test it. I was thinking of running the Year Prediction example from the examples folder, but with the LBFGS optimizer. Do you have any suggestions? Thanks!

e-eight avatar Jun 12 '21 17:06 e-eight

Hi @e-eight, I am not sure I understand your problem correctly. Perhaps you could open a pull request based on your current code, and we can have a discussion there. For now, there is no need to pass all the checks; simply upload your code so that I can take a look and better understand your problem ;-)

xuyxu avatar Jun 13 '21 02:06 xuyxu

Added pull request #81.

e-eight avatar Jun 15 '21 16:06 e-eight

Thanks @e-eight for your PR. I'm kind of busy for the next couple of days; I will get back to you soon.

xuyxu avatar Jun 15 '21 21:06 xuyxu