Benchmarking against other differentiable convex optimization software
This is not urgent, but some day down the road it would be good to benchmark our code (in terms of speed and supported problem types) against other existing solutions. Some possible candidates:
- https://github.com/cvxgrp/cvxpylayers
- https://github.com/locuslab/qpth
How could I forget :sweat_smile:
- https://github.com/cvxgrp/diffcp
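To make the comparison concrete, a speed benchmark could look something like the sketch below. This is only a minimal illustration: `benchmark`, `toy_solver`, and the problem list are hypothetical placeholders, where a real run would plug in wrappers around DiffOpt, cvxpylayers, qpth, or diffcp and a shared set of problem instances.

```python
import time

def benchmark(solver, problems, repeats=5):
    """Time `solver` on each problem instance, returning best-of-`repeats` seconds.

    `solver` is any callable taking one problem; `problems` is an iterable of
    problem instances. Both are placeholders for real solver wrappers and
    real QP instances in an actual benchmark.
    """
    results = []
    for prob in problems:
        timings = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            solver(prob)  # solve (and, for differentiable solvers, differentiate)
            timings.append(time.perf_counter() - t0)
        results.append(min(timings))  # best-of-n reduces timing noise
    return results

# Toy stand-in "solver": trivially returns its input, just to exercise the harness.
toy_solver = lambda prob: prob
print(benchmark(toy_solver, [1.0, 2.0]))
```

Best-of-n timing (rather than the mean) is a common choice here because it filters out one-off OS scheduling noise.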
Yes, agreed: setting up a benchmark can be quite a bit of work. A good starting point would be setting up tests that check for identical solutions across implementations :)
We could use QPs from http://qplib.zib.de/. Open to any other references!
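A cross-implementation correctness test of the kind suggested above could be sketched like this. The two "solvers" here are deliberately toy stand-ins (a closed-form solve and a projected gradient loop for the same one-dimensional QP, min (x - a)^2 s.t. x >= 0); in a real test suite they would be replaced by calls into the actual libraries on shared QPLIB instances.

```python
import math

def solve_closed_form(a):
    # minimize (x - a)^2 subject to x >= 0; the solution is max(a, 0)
    return max(a, 0.0)

def solve_projected_gradient(a, lr=0.1, iters=200):
    # Same problem via projected gradient descent, acting as a second,
    # independent "implementation" whose answer we compare against.
    x = 0.0
    for _ in range(iters):
        x -= lr * 2.0 * (x - a)  # gradient step on (x - a)^2
        x = max(x, 0.0)          # project back onto the feasible set x >= 0
    return x

def assert_same_solution(a, tol=1e-6):
    # The core of the proposed test: two implementations must agree to tolerance.
    assert math.isclose(solve_closed_form(a), solve_projected_gradient(a), abs_tol=tol)

for a in [-1.0, 0.0, 2.5]:
    assert_same_solution(a)
print("solutions agree")
```

Once such equality tests pass, the same problem instances can be reused directly as benchmark inputs, so the test setup is not wasted work.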
See https://github.com/jump-dev/DiffOpt.jl/issues/233 for a report and code showing that DiffOpt is much faster than the alternatives. I think we need to double-check what we're benchmarking and how.