Update ParOpt interface to support sparse constraints
Purpose
ParOpt now supports sparse constraints, which should help with some of the performance issues I've run into when using it on problems with many constraints.
To use this functionality, the ParOpt problem class needs to define two new methods: one that evaluates the sparse constraints separately from the dense ones, and another that computes Jacobian-vector products with the sparse constraint Jacobian.
As a first step, I figured the best approach would be to treat the linear constraints as sparse and the nonlinear constraints as dense.
I'm opening this as a draft because I'm not sure of the best way to separate the evaluation of the linear and nonlinear constraints, and I'm hoping to get @ewu63 's opinion. In particular, I expect ParOpt to call the Jacobian-vector product function quite often, so it may be important to make it efficient: ideally we would assemble a single sparse matrix for all the linear constraints and then do a single mat-vec product, as in the sketch below.
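A minimal sketch of that idea, using `scipy.sparse` rather than anything ParOpt-specific; the matrix sizes and the helper names (`sparse_jac_vec`, `sparse_jac_transpose_vec`) are made up for illustration:

```python
# Illustrative sketch (not the pyoptsparse implementation): stack the linear
# constraint Jacobians into a single sparse matrix once, then every
# Jacobian-vector product is a single sparse mat-vec.
import numpy as np
import scipy.sparse as sp

# Hypothetical per-constraint linear Jacobians, e.g. extracted from the
# pyoptsparse constraint objects.
linear_jacs = [
    sp.random(5, 20, density=0.1, format="csr"),
    sp.random(3, 20, density=0.2, format="csr"),
]

# Assemble once, at setup time.
A_linear = sp.vstack(linear_jacs, format="csr")

def sparse_jac_vec(x_vec):
    """Product of the combined linear-constraint Jacobian with a vector."""
    return A_linear @ x_vec

def sparse_jac_transpose_vec(z_vec):
    """Transpose product, which the optimizer typically also needs."""
    return A_linear.T @ z_vec

# Example use:
out = sparse_jac_vec(np.ones(20))
```

Assembling the combined matrix once at setup keeps each subsequent product to a single CSR mat-vec, which is the behaviour we would want if ParOpt calls the product routine inside its inner iterations.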
Closes https://github.com/mdolab/pyoptsparse/issues/353
Expected time until merged
Type of change
- [ ] Bugfix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (non-backwards-compatible fix or feature)
- [ ] Code style update (formatting, renaming)
- [ ] Refactoring (no functional changes, no API changes)
- [ ] Documentation update
- [ ] Maintenance update
- [ ] Other (please describe)
Testing
Checklist
- [ ] I have run `flake8` and `black` to make sure the Python code adheres to PEP-8 and is consistently formatted
- [ ] I have formatted the Fortran code with `fprettify` or C/C++ code with `clang-format` as applicable
- [ ] I have run unit and regression tests which pass locally with my changes
- [ ] I have added new tests that prove my fix is effective or that my feature works
- [ ] I have added necessary documentation
@gjkennedy just tagging to make you aware of this
Codecov Report
Attention: Patch coverage is 50.00000% with 2 lines in your changes missing coverage. Please review.
Project coverage is 74.89%. Comparing base (`bc021e4`) to head (`9011d96`). Report is 2 commits behind head on main.
| Files with missing lines | Patch % | Lines |
|---|---|---|
| pyoptsparse/pyParOpt/ParOpt.py | 50.00% | 2 Missing :warning: |
Additional details and impacted files
@@ Coverage Diff @@
## main #409 +/- ##
==========================================
- Coverage 74.92% 74.89% -0.03%
==========================================
Files 22 22
Lines 3334 3338 +4
==========================================
+ Hits 2498 2500 +2
- Misses 836 838 +2
@gjkennedy I see there are two different ways to work with sparse constraints in ParOpt:
- Implement `evalSparseObjCon` and `evalSparseObjConGradient` so that the sparse constraints and Jacobian are returned along with the dense constraints
- Implement separate `evalSparseCon` and `addSparseJacobian` methods just for evaluating the sparse constraints.

Is either one of these approaches preferable from ParOpt's point of view? Also, in option 1 I see that `evalSparseObjConGradient` directly populates the data array of the ParOpt sparse Jacobian; how/when is the sparsity pattern supposed to be given to ParOpt?
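For reference, here is a minimal sketch of the "fixed sparsity pattern, update only the values" workflow that option 1 appears to assume, written with `scipy.sparse` rather than ParOpt's API; the helper name `eval_sparse_jacobian` and the example values are made up, and how the pattern itself gets registered with ParOpt is exactly the open question above:

```python
# Sketch of updating only the data array of a sparse Jacobian whose sparsity
# pattern is fixed at setup time; ParOpt's own mechanism for receiving the
# pattern may differ, so treat this purely as an illustration.
import numpy as np
import scipy.sparse as sp

# Sparsity pattern (CSR row pointers and column indices) defined once, e.g.
# when the problem is set up.
rowp = np.array([0, 2, 3])   # row pointer
cols = np.array([0, 4, 2])   # column indices
data = np.zeros(3)           # value storage reused every evaluation

A = sp.csr_matrix((data, cols, rowp), shape=(2, 5))

def eval_sparse_jacobian(x):
    # Only the numerical values change between calls; the pattern is fixed,
    # so writing into A.data in place is enough.
    A.data[:] = [2.0 * x[0], 1.0, -x[2]]
    return A
```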
Closing as this is superseded by #414