Check for conditioning of the problem
Check whether the Jacobian of the cost function and the Jacobian and Hessian of the constraints are well-conditioned. Otherwise raise a warning (via a flag in `solve`, or in an `ocp` method).
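A minimal sketch of such a check, assuming NumPy; the helper name and the warning threshold are illustrative, not an existing API:

```python
import warnings
import numpy as np

def warn_if_ill_conditioned(matrix: np.ndarray, name: str, threshold: float = 1e10) -> float:
    """Warn when the 2-norm condition number of `matrix` exceeds `threshold`."""
    cond = float(np.linalg.cond(matrix))
    if cond > threshold:
        warnings.warn(f"{name} is ill-conditioned: cond = {cond:.2e} > {threshold:.0e}")
    return cond

# A nearly rank-deficient Jacobian triggers the warning
jac = np.array([[1.0, 1.0],
                [1.0, 1.0 + 1e-12]])
cond_jac = warn_if_ill_conditioned(jac, "constraint Jacobian")
```

The same helper could be called on the cost Jacobian and the constraint Hessians before handing the problem to the solver.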
perhaps some ideas in : http://faculty.bscb.cornell.edu/~hooker/zebra_desc2.pdf
We should also use visualization tools for these matrices, as in https://gitlab.kuleuven.be/meco-software/rockit
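A sketch of such a visualization, assuming matplotlib; the `spy_matrix` helper is hypothetical and only mimics the sparsity plots found in rockit:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt

def spy_matrix(matrix: np.ndarray, title: str, filename: str) -> None:
    """Save a sparsity-pattern (spy) plot of a Jacobian or Hessian to disk."""
    fig, ax = plt.subplots()
    ax.spy(matrix, markersize=3)
    ax.set_title(title)
    fig.savefig(filename)
    plt.close(fig)

# Banded test matrix standing in for a transcribed constraint Jacobian
band = np.eye(10) + np.eye(10, k=1)
spy_matrix(band, "Constraint Jacobian sparsity", "jacobian_spy.png")
```

Block structure (or its absence) in such a plot is often enough to spot a mis-indexed constraint by eye.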
Main tricks from this article:
- Impose loose boundary conditions on the state variables. Although these are not active in the final solution, they help improve the conditioning of the system at intermediate steps.
- Successively increase the set of parameters to be optimized. This helps keep the Hessian positive definite and provides a sequence of good initialization values for the next optimization problem.
- Consider transformations of the system. These have not been employed here, but the log transform dx/dt = f(x), y = log(x) ⇒ dy/dt = e^(−y) f(e^y) is commonly found to make system trajectories easier to solve, particularly when they involve large values of f(x).
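The log transform can be sanity-checked numerically. A sketch assuming SciPy, using f(x) = x as a stand-in dynamics (so x(t) = e^t and y(t) = log x(t) = t):

```python
import numpy as np
from scipy.integrate import solve_ivp

def f(x):
    return x  # illustrative dynamics; stands in for any f with x > 0

def original(t, x):
    # dx/dt = f(x)
    return f(x)

def transformed(t, y):
    # log transform: y = log(x)  =>  dy/dt = e^(-y) f(e^y)
    return np.exp(-y) * f(np.exp(y))

t_span = (0.0, 2.0)
sol_x = solve_ivp(original, t_span, [1.0], rtol=1e-10, atol=1e-12)
sol_y = solve_ivp(transformed, t_span, [0.0], rtol=1e-10, atol=1e-12)

x_end = sol_x.y[0, -1]  # should equal e^2
y_end = sol_y.y[0, -1]  # should equal log(x_end) = 2
```

The point of the transform is that y stays O(t) even when x grows exponentially, which tends to keep the transcribed problem better scaled.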
The AMPL (https://ampl.com) interface for IPOPT is used for parameter estimation, diagnostics and system modifications.
From #456: EveCharbie commented 8 days ago:
Add a check before optimization for how well the problem is formulated, which would be useful for debugging. I think it would go as follows: Jacobian rank − dimensions = number of non-linearly-independent constraints.
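A sketch of that rank check, assuming the constraint Jacobian is available as a dense NumPy array (the helper name is illustrative):

```python
import numpy as np

def count_dependent_constraints(jacobian: np.ndarray) -> int:
    """Number of constraint rows minus Jacobian rank = number of
    (locally) linearly dependent, i.e. redundant, constraints."""
    n_constraints = jacobian.shape[0]
    rank = int(np.linalg.matrix_rank(jacobian))
    return n_constraints - rank

# Third constraint row is the sum of the first two -> one redundant constraint
jac_redundant = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0],
])
```

Note this is a local statement: the Jacobian is evaluated at one point, so a nonzero count flags constraints that are dependent there, not necessarily everywhere.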
Also, if we follow this path, I would suggest we add a check for the convexity of the objective function, using the second derivative being positive over the entire domain, and a check that the equality constraints are linear (I think they should be).
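Checking positivity over the entire domain is generally intractable, so a practical version of both checks works pointwise or by sampling. A sketch assuming NumPy, with illustrative helper names:

```python
import numpy as np

def hessian_is_psd(hessian: np.ndarray, tol: float = 1e-10) -> bool:
    """Pointwise convexity check: Hessian positive semidefinite at one point."""
    return bool(np.all(np.linalg.eigvalsh(hessian) >= -tol))

def constraint_is_affine(g, dim: int, n_trials: int = 5, tol: float = 1e-8, seed: int = 0) -> bool:
    """Sampling heuristic: an affine g satisfies g(x1 + x2) = g(x1) + g(x2) - g(0).
    Passing the test on random samples is strong evidence, not a proof."""
    rng = np.random.default_rng(seed)
    g0 = g(np.zeros(dim))
    for _ in range(n_trials):
        x1, x2 = rng.standard_normal(dim), rng.standard_normal(dim)
        if not np.allclose(g(x1 + x2), g(x1) + g(x2) - g0, atol=tol):
            return False
    return True
```

For example, `constraint_is_affine(lambda x: 2 * x + 1, dim=3)` passes while `lambda x: x ** 2` fails, and `hessian_is_psd` would be called on the objective's Hessian at the sampled iterates.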
@julo0
#579: the last problem is here.