MLJTuning.jl

Frameworks for HP optimization

Open azev77 opened this issue 5 years ago • 4 comments

Julia HP optimization packages:

  • [ ] Hyperopt.jl @baggepinnen (Random search, Latin hypercube sampling, Bayesian opt)
  • [ ] TreeParzen.jl (port of Hyperopt.py to Julia) @IQVIA-ML @iqml
  • [ ] NaiveGAflux.jl (helps automate Flux) @DrChainsaw

Other HP optimization packages:

There are projects that benchmark different AutoML systems: https://openml.github.io/automlbenchmark/

Following on from our conversation (https://github.com/alan-turing-institute/MLJ.jl/issues/416#issuecomment-640823116), I wanted to tell you about Optuna (repo & paper), a new framework for HP optimization. A nice comparison with Hyperopt shows what can be done for HP visualization: https://neptune.ai/blog/optuna-vs-hyperopt

Here are a few snips: [screenshots from the Optuna vs Hyperopt comparison omitted]

A 3-minute clip: https://www.youtube.com/watch?v=-UeC4MR3PHM

It would really be amazing for MLJ to incorporate this!

azev77 avatar Jun 09 '20 04:06 azev77

> Hyperopt.py (also Hyperopt.jl)

To clarify, Hyperopt.jl is not related to the Python hyperopt. It uses different optimisation techniques (random search, Latin hypercube sampling, and Bayesian optimisation) and deserves its own position in the list.

However, TreeParzen.jl is a direct port of the Python hyperopt to Julia; it uses the same optimisation technique (Tree-structured Parzen Estimators) and has the same behaviour.
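For context, random search (the simplest of the techniques listed above) just draws candidate hyperparameters uniformly from the search space, evaluates the objective at each, and keeps the best. A minimal sketch in plain Julia (no packages; the `objective` function and the search ranges are hypothetical stand-ins for a model's validation loss and hyperparameter bounds):

```julia
# Toy objective standing in for a model's validation loss,
# with two hyperparameters `a` and `b` (minimum at a = 2, b = 0.5).
objective(a, b) = (a - 2.0)^2 + (b - 0.5)^2

# Evaluate `iters` uniform random draws from the search space
# and return the best loss and parameters seen.
function random_search(objective; iters = 100)
    best_loss, best_params = Inf, nothing
    for _ in 1:iters
        a = 5.0 * rand()   # a ~ Uniform(0, 5)
        b = rand()         # b ~ Uniform(0, 1)
        loss = objective(a, b)
        if loss < best_loss
            best_loss, best_params = loss, (a = a, b = b)
        end
    end
    return best_loss, best_params
end

best_loss, best_params = random_search(objective)
```

The more sophisticated strategies (Latin hypercube, Bayesian optimisation, TPE) differ only in how the candidate points are proposed, not in this evaluate-and-keep-the-best loop.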

ghost avatar Jun 09 '20 06:06 ghost

Maybe it is worth considering bandit frameworks such as Ax.

vollmersj avatar Jul 04 '20 18:07 vollmersj

Thanks @vollmersj, added. Please let me know if you have other suggestions.

azev77 avatar Jul 04 '20 23:07 azev77

Hi, I'm really excited to see a Bayesian optimization method for hyperparameter tuning! I note that RandomSearch() and LatinHypercube() are already possible choices for the tuning = kwarg of TunedModel(), and I see them grouped together with Bayesian optimization in the original post of this issue. Is Bayesian optimization already available for hyperparameter tuning, and if not, is there any sense of when it will land? :-)
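In the meantime, the strategies that already exist can be swapped in via the `tuning` kwarg mentioned above. A hedged sketch of how I understand that API (the model choice, range bounds, and `n` here are arbitrary illustrations, not recommendations; check the MLJ docs for exact signatures, and note `make_regression` is used only to have self-contained data):

```julia
using MLJ  # assumes MLJ and MLJLinearModels are installed

# Synthetic data so the example is self-contained
X, y = make_regression(100, 3)

Ridge = @load RidgeRegressor pkg = MLJLinearModels
model = Ridge()

# Log-scaled range for the regularisation strength
r = range(model, :lambda, lower = 1e-3, upper = 1e2, scale = :log)

tuned = TunedModel(
    model = model,
    tuning = RandomSearch(),   # LatinHypercube() should slot in here too
    range = r,
    resampling = CV(nfolds = 5),
    measure = rms,
    n = 30,                    # number of candidate models to evaluate
)

mach = machine(tuned, X, y)
fit!(mach)
best = fitted_params(mach).best_model
```

Presumably a future Bayesian strategy would be another drop-in value for `tuning`, which is what makes the question above appealing.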

Thanks a lot for basically everything so far!

casasgomezuribarri avatar Feb 05 '21 15:02 casasgomezuribarri