Geoffrey Negiar

50 issues authored by Geoffrey Negiar

Opening this as a nota bene. When optimizing over complex parameters, the gradient must be conjugated. Because of this, all jaxopt optimizers are currently incorrect on complex parameters (see the sketch after this item). Moreover,...

bug
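A minimal sketch of the point, assuming a real-valued loss of complex parameters; per JAX's convention, the descent direction is the conjugate of what `jax.grad` returns. The loss and step size here are illustrative, not jaxopt internals.

```python
import jax
import jax.numpy as jnp

def loss(z):
    # Real-valued loss of a complex parameter vector.
    return jnp.sum(jnp.abs(z) ** 2)

def gd_step(z, stepsize=0.1):
    g = jax.grad(loss)(z)
    # For complex parameters, the steepest-descent direction is the
    # *conjugate* of the gradient returned by jax.grad; stepping along
    # g itself is what makes an optimizer incorrect here.
    return z - stepsize * jnp.conj(g)

z = jnp.array([1.0 + 2.0j, -0.5 + 0.3j])
print(gd_step(z))  # every coordinate shrinks toward 0, as expected
```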

This allows precomputing a preconditioner and sharing it across multiple outer loops, where the inner loop solves an equality-constrained QP. This should provide speedups when the parameters...
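A rough sketch of the pattern, not the actual PR code: the KKT operator and the naive dense-solve preconditioner below are illustrative assumptions, reused unchanged across outer iterations via the `M` argument of `jax.scipy.sparse.linalg.gmres`.

```python
import jax.numpy as jnp
from jax.scipy.sparse.linalg import gmres

n, m = 5, 2
Q = 2.0 * jnp.eye(n)                   # SPD objective matrix
A = jnp.array([[1., 0., 1., 0., 1.],
               [0., 1., 0., 1., 0.]])  # full-row-rank constraints
rhs = jnp.ones(n + m)

def kkt_matvec(v):
    # Matrix-vector product with the KKT matrix [[Q, A.T], [A, 0]].
    x, y = v[:n], v[n:]
    return jnp.concatenate([Q @ x + A.T @ y, A @ x])

# Precompute the preconditioner once (here, naively, a dense solve
# with the full KKT matrix) and share it across all outer loops.
K = jnp.block([[Q, A.T], [A, jnp.zeros((m, m))]])
M = lambda v: jnp.linalg.solve(K, v)

for outer_step in range(3):            # every inner solve reuses M
    sol, _ = gmres(kkt_matvec, rhs, M=M)
```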

I benchmarked GMRES (currently the default in `jax.eq_qp`) against two `scipy` solvers: [minres](https://docs.scipy.org/doc/scipy/reference/generated/scipy.sparse.linalg.minres.html#scipy.sparse.linalg.minres) and [LGMRES](https://docs.scipy.org/doc/scipy/reference/generated/scipy.sparse.linalg.lgmres.html#scipy.sparse.linalg.lgmres). I sampled random equality-constrained QP KKT matrices and targets, and found pretty stark differences between...
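A minimal reconstruction of that kind of benchmark; the problem sizes, conditioning, and lack of timing code are arbitrary choices here, not the author's actual setup.

```python
import numpy as np
from scipy.sparse.linalg import gmres, minres, lgmres

rng = np.random.default_rng(0)
n, m = 100, 20

# Random equality-constrained QP KKT system [[Q, A.T], [A, 0]] v = b,
# with Q symmetric positive definite and A full row rank.
B = rng.standard_normal((n, n))
Q = B @ B.T + n * np.eye(n)
A = rng.standard_normal((m, n))
K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
b = rng.standard_normal(n + m)

# The KKT matrix is symmetric indefinite, so minres applies directly.
for name, solver in [("gmres", gmres), ("minres", minres), ("lgmres", lgmres)]:
    x, info = solver(K, b)
    residual = np.linalg.norm(K @ x - b)
    print(f"{name}: info={info}, residual={residual:.2e}")
```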

Hi, thanks for the cool package! I've just started playing around with it on a QP I need to differentiate through. I'm getting the following error when calling `jax.jacobian` on...

I'm trying to normalize a string using your parser by replacing all dates with a `""` (empty) token, but I can't find an easy way to do so. Any suggestions?
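One possible approach, assuming the parser exposes substring matches the way `dateparser.search.search_dates` does; the library choice is an assumption, since the excerpt does not name it.

```python
from dateparser.search import search_dates

def strip_dates(text, token=""):
    # search_dates returns (matched_substring, datetime) pairs,
    # or None when nothing is found.
    matches = search_dates(text) or []
    for substring, _ in matches:
        text = text.replace(substring, token)
    return text

print(strip_dates("Meeting on March 3, 2020 and again tomorrow."))
```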

It seems like the current implementation doesn't allow broadcasting arguments. Here's an example for normalizing leaves:

```python
import tree_math as tm
import jax
import jax.numpy as jnp

a = jnp.ones(10)
...
```
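A workaround sketch for the normalization itself, using plain `jax.tree_util` rather than `tree_math`; this sidesteps, rather than fixes, the broadcasting limitation.

```python
import jax
import jax.numpy as jnp

tree = {"a": jnp.ones(10), "b": jnp.arange(3.0)}

# Normalize each leaf by its own Euclidean norm.
normalized = jax.tree_util.tree_map(
    lambda x: x / jnp.linalg.norm(x), tree)
```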

Seems interesting that the sublinear step size never chooses the same vertex twice in a row. My guess is that this is due to zigzagging behavior. ![image](https://user-images.githubusercontent.com/11814989/85133774-9cb2ce80-b23b-11ea-919c-fe1b073ff181.png)
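For context, a minimal Frank-Wolfe iteration over the probability simplex with the classic sublinear step size γ_t = 2/(t+2); the objective and problem size are generic reconstructions, not the experiment behind the plot above.

```python
import jax.numpy as jnp
from jax import grad

def fw_simplex(f, x0, num_steps=50):
    # Frank-Wolfe over the simplex: the linear minimization oracle
    # returns the vertex e_i with i = argmin_i of the gradient.
    x = x0
    vertices = []
    for t in range(num_steps):
        g = grad(f)(x)
        i = jnp.argmin(g)
        s = jnp.zeros_like(x).at[i].set(1.0)  # selected vertex
        gamma = 2.0 / (t + 2.0)               # sublinear step size
        x = (1 - gamma) * x + gamma * s
        vertices.append(int(i))
    return x, vertices

f = lambda x: jnp.sum((x - jnp.array([0.3, 0.3, 0.4])) ** 2)
x, vertices = fw_simplex(f, jnp.ones(3) / 3)
print(vertices[:10])  # consecutive vertices tend to alternate (zigzag)
```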

Parallelize the sparse matrix multiplication row-wise; see the sketch below. Cf. [this thread](https://stackoverflow.com/questions/46924092/how-to-parallelize-this-python-for-loop-when-using-numba), which reports nice speedups.

enhancement
performance
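A sketch of the row-wise parallelization in the Numba `prange` style the linked thread describes, assuming a CSR layout (`data`, `indices`, `indptr`); the function name and format choice are illustrative.

```python
import numpy as np
from numba import njit, prange

@njit(parallel=True)
def csr_matvec(data, indices, indptr, x):
    # Each row's dot product is independent, so rows can be
    # distributed across threads with prange.
    n_rows = indptr.shape[0] - 1
    out = np.zeros(n_rows)
    for i in prange(n_rows):
        acc = 0.0
        for k in range(indptr[i], indptr[i + 1]):
            acc += data[k] * x[indices[k]]
        out[i] = acc
    return out
```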

It would be nice to have an example comparing the speed of convergence of SAGA/SVRG/SFW on problems that attain the same optimum (see the skeleton below).

enhancement
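A possible skeleton for such an example; the solver interface (a callable returning a trace of objective values) and the names `saga`, `svrg`, `sfw` are placeholders, not an existing API.

```python
import numpy as np
import matplotlib.pyplot as plt

def run_benchmark(solvers, objective, f_star, num_steps=1000):
    # `solvers` maps a name to a callable returning the trace of
    # objective values; f_star is the shared optimal value.
    for name, solver in solvers.items():
        trace = solver(objective, num_steps)
        plt.semilogy(np.asarray(trace) - f_star, label=name)
    plt.xlabel("iteration")
    plt.ylabel("suboptimality f(x_t) - f*")
    plt.legend()
    plt.show()

# Usage (placeholder solver callables):
# run_benchmark({"SAGA": saga, "SVRG": svrg, "SFW": sfw},
#               objective=my_loss, f_star=0.0)
```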