Fabian Pedregosa
The `jit` and `unroll` options don't seem to be documented in https://jaxopt.github.io/stable/_autosummary/jaxopt.GradientDescent.html#jaxopt-gradientdescent
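For reference, both options are accepted by the solver's constructor even though they don't appear on the rendered API page. A minimal sketch (the quadratic objective below is made up for illustration):

```python
import jax.numpy as jnp
from jaxopt import GradientDescent

# Toy quadratic objective, for illustration only.
def fun(w, b):
  return jnp.sum((w - b) ** 2)

# `jit` and `unroll` are constructor options (default "auto"),
# but neither shows up in the documentation linked above.
solver = GradientDescent(fun=fun, maxiter=100, jit=True, unroll=False)
params, state = solver.run(jnp.zeros(3), b=jnp.ones(3))
```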
For example, in LBFGS, reading the return type, it would seem like this returns an object of type OptStep, while in reality it returns a tuple, of which the...
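A quick way to see what `run()` actually returns (the toy objective is made up for illustration):

```python
import jax.numpy as jnp
from jaxopt import LBFGS

# Toy objective, for illustration only.
def fun(w):
  return jnp.sum((w - 1.0) ** 2)

result = LBFGS(fun=fun).run(jnp.zeros(3))
print(type(result))      # inspect the actual return type
params, state = result   # it unpacks like a plain tuple either way
```

My understanding is that OptStep is a NamedTuple, so it is also a tuple, which may be the source of the confusion; the docs could say so explicitly.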
These are the examples in docs/notebooks/. It's probably necessary to add this line when running on TPUs: https://github.com/google/jax#pip-installation-colab-tpu . I think this would be a good first issue.
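If I remember correctly, the line in question is the Colab TPU setup from the linked README section; a sketch of what each notebook would need near the top (assuming that API is still current):

```python
# Colab TPU setup, per the JAX README section linked above.
import jax.tools.colab_tpu
jax.tools.colab_tpu.setup_tpu()

import jax
print(jax.devices())  # should now list TPU devices
```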
https://jaxopt.github.io/stable/auto_examples/implicit_diff/plot_dataset_distillation.html#sphx-glr-auto-examples-implicit-diff-plot-dataset-distillation-py
Standard baselines are able to reach 90%+ accuracy without much tweaking (see for example https://github.com/kuangliu/pytorch-cifar ), while our example never goes beyond 70% accuracy on the validation set.
When running the deep learning examples, say deep_learning/flax_image_classif.py, GPU utilization never rises above 5%, while for the equivalent Flax example GPU utilization is around 90%, and the...
As far as I understand, there is currently no way to have the examples simultaneously adhere to [PEP 257](https://www.python.org/dev/peps/pep-0257/) and have the title recognized correctly by sphinx-gallery. In particular, the...
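To make the conflict concrete: sphinx-gallery looks for a reST title (a line of text plus an underline) at the top of the module docstring, whereas PEP 257 prescribes a one-line summary ending in a period. A sketch of the two forms (the example title is made up):

```python
# What sphinx-gallery expects at the top of an example script:
"""
Dataset distillation
====================

Longer description of the example goes here.
"""

# What PEP 257 prescribes for a module docstring:
"""Distill a dataset via implicit differentiation.

Longer description of the example goes here.
"""
```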
because it's cool