
Relaxed Lasso

azev77 opened this issue · 9 comments

The R glmnet package includes a relaxed lasso option, which recent research has shown performs very well. Would it be possible for GLMNet.jl to support this as well?

azev77 avatar Feb 09 '21 20:02 azev77

I had a quick look at how this is implemented in the R package, and it looks like the logic for the relaxed option sits in the R code rather than in the core Fortran library. So unfortunately we can't simply access the relaxed option from the compiled library; this R logic would need to be duplicated in the Julia package, which is a bigger undertaking.
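For context, the idea in that R logic is roughly: at each value of lambda, refit an unpenalized model on the active set and blend it with the penalized coefficients. A minimal, untested sketch of that idea in Julia (the `relaxed_betas` helper is hypothetical, and the blending convention, with `gamma = 1` recovering the ordinary lasso, is an assumption based on the R glmnet documentation, not code from this package):

```julia
# Hypothetical sketch, not part of GLMNet.jl: relaxed lasso for the
# Gaussian case, blending penalized and unpenalized fits via gamma.
using GLMNet

function relaxed_betas(X::Matrix{Float64}, y::Vector{Float64}; gamma::Float64=0.5)
    path = glmnet(X, y)                    # standard lasso path
    betas = convert(Matrix, path.betas)    # dense predictors-by-lambda matrix
    relaxed = copy(betas)
    for j in 1:size(betas, 2)
        active = findall(!iszero, betas[:, j])
        isempty(active) && continue
        # unpenalized least-squares refit on the active predictors (with intercept)
        b = [ones(length(y)) X[:, active]] \ y
        # gamma = 1 keeps the lasso fit; gamma = 0 is the fully relaxed fit
        relaxed[active, j] = gamma .* betas[active, j] .+ (1 - gamma) .* b[2:end]
    end
    return relaxed
end
```

This ignores weights, offsets, and non-Gaussian families, which is part of why duplicating the full R logic is a bigger undertaking.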

JackDunnNZ avatar Feb 10 '21 13:02 JackDunnNZ

I see. That means it's likely to be faster in the Julia version.

azev77 avatar Feb 10 '21 17:02 azev77

Thanks for the great package! Would it be possible to change `CompressedPredictorMatrix` to a mutable struct? This would allow modifying the predicted values and implementing the relaxed lasso.

AdaemmerP avatar Dec 01 '21 11:12 AdaemmerP

I think that should be fine, although it might be better to update the code to use a generic sparse matrix instead of the custom struct. I'm not too familiar with the internals of the package, but it feels like that should be possible?

JackDunnNZ avatar Dec 01 '21 15:12 JackDunnNZ

Yes, a sparse matrix might be better for storing the parameters. Regarding the struct, I think line 90 should be changed to a `mutable struct`: https://github.com/JuliaStats/GLMNet.jl/blob/master/src/GLMNet.jl

Or is there any other way to change and save the values? I want to modify the parameters and then use them with `GLMNet.predict()`.

AdaemmerP avatar Dec 01 '21 15:12 AdaemmerP

> Or is there any other possibility to change and save the values?

You may be able to use Setfield.jl or Accessors.jl to easily update the `GLMNetPath` with new coefficients, something like:

```julia
new_path = @set path.betas = new_betas
```
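For example, with Accessors.jl (the `Path` struct below is a hypothetical stand-in for `GLMNetPath`, just to show the mechanics):

```julia
using Accessors  # provides @set for updating immutable structs

struct Path                     # hypothetical stand-in for GLMNetPath
    betas::Matrix{Float64}
    lambda::Vector{Float64}
end

p = Path(zeros(2, 2), [0.1, 0.01])

# @set returns a copy with one field replaced; `p` itself is untouched
p2 = @set p.betas = ones(2, 2)

@assert p.betas == zeros(2, 2)
@assert p2.betas == ones(2, 2)
```

Since this builds a new object rather than mutating in place, it works without making `CompressedPredictorMatrix` mutable.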

JackDunnNZ avatar Dec 01 '21 15:12 JackDunnNZ

Nice, thanks for the tip!

AdaemmerP avatar Dec 01 '21 15:12 AdaemmerP

@AdaemmerP did you have any luck implementing the relaxed Lasso?

azev77 avatar Sep 02 '22 17:09 azev77

@azev77 Yes, I was able to implement it, albeit within a time-series framework (https://github.com/AdaemmerP/DetectSparsity/blob/main/CaseStudies/Functions.jl, lines 337-501). I also used the Lasso.jl package for it.

AdaemmerP avatar Sep 13 '22 06:09 AdaemmerP