Thank you, Daniel. I will give you guys a heads-up here when I publish on GitHub!
Hey guys, my package is now available on GitHub under the Apache 2 license: https://github.com/dschmitz89/simplenlopt. Docs: https://simplenlopt.readthedocs.io/en/latest/index.html. Some parts still need polishing, especially a PyPI build. Let me know if...
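For anyone curious, here is a minimal sketch of what a SciPy-style call could look like. The function name, signature, and result attributes are an assumption on my part (mirroring `scipy.optimize.minimize`), so please check the linked docs for the real API:

```python
import numpy as np
import simplenlopt  # assumed to expose a scipy.optimize-like minimize; see the docs above

def rosenbrock(x):
    # Classic 2-D Rosenbrock test function, minimum at (1, 1)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

x0 = np.array([-1.5, 2.0])

# Hypothetical call mirroring scipy.optimize.minimize; the exact
# keyword arguments and defaults may differ in the actual package.
res = simplenlopt.minimize(rosenbrock, x0)
print(res.x, res.fun)  # result attributes assumed to follow scipy's OptimizeResult
```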
> Thank you @dschmitz89. I am not sure about saying SHGO is not stochastic compared to DE for instance. The outcome of the algorithm also depends on the sampling strategy...
@Stefan-Endres: is it correct that shgo's default `simplicial` sampling strategy generates deterministic results while the other ones are stochastic? If so, this should also maybe be stated in...
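As a quick, non-conclusive empirical check, one could run shgo twice per sampling strategy on a toy problem and compare the returned minimizers; identical results across runs would at least be consistent with deterministic behaviour. A minimal sketch:

```python
import numpy as np
from scipy.optimize import shgo, rosen

bounds = [(-2.0, 2.0), (-2.0, 2.0)]

# Run each sampling strategy twice and compare the returned minimizers.
# Bit-identical results across runs suggest (but do not prove) that the
# strategy behaves deterministically on this problem.
for method in ("simplicial", "sobol"):
    res1 = shgo(rosen, bounds, sampling_method=method)
    res2 = shgo(rosen, bounds, sampling_method=method)
    print(method, np.allclose(res1.x, res2.x), res1.fun)
```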
> > Then it would be necessary to add `kwargs` to all global optimizers?
>
> At minimum the stopping criteria should be added (in the benchmark case at...
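To make this concrete, here is a rough sketch of how per-solver stopping criteria could be forwarded as kwargs; the dictionary of settings and the specific values are purely illustrative, not the actual benchmark code:

```python
from scipy.optimize import differential_evolution, dual_annealing, shgo, rosen

bounds = [(-2.0, 2.0), (-2.0, 2.0)]

# Hypothetical per-solver stopping criteria; a benchmark harness would
# forward these as **kwargs so that each optimizer gets its own budget.
solver_kwargs = {
    differential_evolution: {"maxiter": 200, "tol": 1e-8},
    dual_annealing: {"maxiter": 200},
    shgo: {"options": {"maxfev": 10000}},
}

for solver, kwargs in solver_kwargs.items():
    res = solver(rosen, bounds, **kwargs)
    print(solver.__name__, res.fun, res.nfev)
```

The point is only that each solver spells its budget differently (`maxiter`, `tol`, `options={'maxfev': ...}`), so some per-solver kwargs mapping seems hard to avoid.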
> > however, I don't consider stopping criteria to be hyperparameters because they do not change the performance of the algorithms
>
> I don't think that's the only relevant...
> > An algorithm that can go into an infinite loop for particular values of input arguments is a design that's ill-suited for non-expert users.
>
> This should not...
@andyfaff: is this ready to go?
I cannot find them in the CircleCI output either: https://circleci.com/api/v1.1/project/github/scipy/scipy/52636/output/104/0?file=true&allocation-id=62ec0478e838fa23a9d461f1-0-build%2F60B53137 I guess running the global optimization benchmarks requires an additional flag.
Thanks @mdhaber. On my end, running DIRECT and SHGO alone worked too, but I could not get the complete benchmarks to run including the stochastic solvers (set the...