Leo
So here is a workaround for this problem: I wrapped the models that need to share hyper-parameters into a composite model that also carries those hyper-parameters as its own fields. Then I...
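To make the wrapping idea concrete, here is a minimal plain-Julia sketch (all names are hypothetical, nothing here is MLJ API): the composite owns the shared hyper-parameter and pushes it into its submodels before every (re)fit.

```julia
# A stand-in for any model exposing a `max_depth` hyper-parameter.
mutable struct DummyTree
    max_depth::Int
end

mutable struct SharedDepthComposite
    depth::Int            # shared hyper-parameter, owned by the composite
    tree1::DummyTree
    tree2::DummyTree
end

# Push the shared value into both submodels before (re)fitting; in the MLJ
# setting this would happen inside the composite's fit/update.
function sync_hyperparameters!(c::SharedDepthComposite)
    c.tree1.max_depth = c.depth
    c.tree2.max_depth = c.depth
    return c
end

c = SharedDepthComposite(5, DummyTree(1), DummyTree(2))
sync_hyperparameters!(c)
@assert c.tree1.max_depth == c.tree2.max_depth == 5
```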
Thanks a lot for this detailed answer @OkonSamuel. If I understand correctly, your idea is to wrap the hyper-parameters inside a model, so that the default update function for composite models...
So the only option is to re-implement an `update` method? I have read the implementation of the default `update` method, in particular this part: > If any `model` field has been replaced...
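For reference, a hand-written `update` can short-circuit the refit when the hyper-parameters it cares about are unchanged. Below is a minimal sketch using the `MLJModelInterface.update` signature; `MyComposite` and its trivial `fit` are toy stand-ins I made up for illustration, not real MLJ models.

```julia
import MLJModelInterface
const MMI = MLJModelInterface

# Toy composite with one tracked hyper-parameter (illustrative only).
mutable struct MyComposite <: MMI.Deterministic
    shared_param::Float64
end

function MMI.fit(model::MyComposite, verbosity, X, y)
    fitresult = model.shared_param          # stand-in for real training
    cache = (old_model = deepcopy(model),)  # remember what we trained with
    return fitresult, cache, NamedTuple()
end

# Refit only if the tracked hyper-parameter actually changed.
function MMI.update(model::MyComposite, verbosity, fitresult, cache, X, y)
    if model.shared_param == cache.old_model.shared_param
        return fitresult, cache, NamedTuple()  # nothing to do: reuse fitresult
    end
    return MMI.fit(model, verbosity, X, y)     # fall back to a full refit
end
```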
Yes, I was executing the `update` function line by line, and it seems that in our case `!issubset(submodel_ids, network_model_ids)` returns true (thus refitting everything) because the submodels are not...
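That identity check is easy to reproduce in isolation: the ids are `objectid`s, so a value-equal *copy* of a submodel still fails the subset test. The variable names below mirror those in the source; the `Ref`s are just stand-in objects.

```julia
# Stand-in for a fitted submodel living in the learning network:
original = Ref(1)
network_model_ids = Set([objectid(original)])

# A model field now holding a value-equal but distinct copy:
replacement = deepcopy(original)
submodel_ids = Set([objectid(replacement)])

issubset(submodel_ids, network_model_ids)   # false, because the ids differ,
!issubset(submodel_ids, network_model_ids)  # so this is true -> full refit
```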
Ok I just realized something really stupid that changes everything: if two different models have the same object as a hyper-parameter, then they will be automatically synchronised. Moreover, the other problem...
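That behaviour is easy to check in isolation (the types below are made up for the demonstration): because Julia assigns mutable objects by reference, two models holding the very same hyper-parameter object see every mutation at once.

```julia
mutable struct Knob
    value::Int
end

mutable struct ModelA; knob::Knob; end
mutable struct ModelB; knob::Knob; end

shared = Knob(3)
a = ModelA(shared)   # both models hold a reference to
b = ModelB(shared)   # the *same* Knob object

a.knob.value = 7           # mutate through one model...
@assert b.knob.value == 7  # ...and the other sees it: they are "tied"
```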
That's correct. This is OK for my use case though, as I need this specific feature for a particular custom model; it will indeed not work for already-implemented models. I...
Thanks to how flexibly the code is implemented, it was not that hard to figure this out! Just to sum up before beginning: the idea is always to "tie" hyper-parameters...
Thanks for showing enthusiasm for this idea. I had no idea this concept was called "diagonal" ranges. First of all, my use case for diagonal ranges occurs with custom models...
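In case it helps pin down the concept, here is a hypothetical sketch of what "diagonal" sampling amounts to, outside of any MLJ machinery: a single random draw is assigned to every tied field, so the tied hyper-parameters only ever visit the diagonal of their joint range.

```julia
using Random

# One draw from `values`, written into every tied field of `model`.
function diagonal_sample!(model, fields::Vector{Symbol}, values,
                          rng=Random.default_rng())
    v = rand(rng, values)        # a single draw...
    for f in fields
        setfield!(model, f, v)   # ...applied to every tied field
    end
    return model
end

mutable struct TwoTrees
    depth1::Int
    depth2::Int
end

m = diagonal_sample!(TwoTrees(0, 0), [:depth1, :depth2], 1:10)
@assert m.depth1 == m.depth2     # the fields stay on the diagonal
```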
So the following:

```julia
tuning = RandomSearch()
self_tuning_tree_model = TunedModel(model=tree_model,
                                    tuning=tuning,
                                    resampling=CV(nfolds=3),
                                    range=r,
                                    measure=rms,
                                    n=10);
self_tuning_tree = machine(self_tuning_tree_model, X, y);
fit!(self_tuning_tree, verbosity=0);

# Inspecting results
fitted_params(self_tuning_tree).best_model
report(self_tuning_tree).plotting.parameter_values
```

is working...
Hi, sorry, I have been on a break for two weeks. I am very happy to see that my work has proved useful for the MLJ workflow. As...