
[GENERAL SUPPORT]: Challenges in Multi-Objective Bayesian Optimization with ~60 Parameters and ~30 Objectives

Open vinaysaini94 opened this issue 2 months ago • 2 comments

Question

I'm working on a multi-objective Bayesian optimization (MOBO) problem: a system with roughly 60 input parameters and around 30 performance metrics that we would ideally like to optimize simultaneously. In BoTorch/Ax, treating all 30 metrics as separate objectives becomes computationally challenging: training that many independent GP surrogates does not scale well, and hypervolume-based acquisition functions such as qNEHVI degrade or become extremely slow in high-dimensional objective spaces. Has anyone dealt with MOBO in the many-objective setting (15–30 objectives)? What methods would you recommend?

Please provide any relevant code snippet if applicable.


Code of Conduct

  • [x] I agree to follow Ax's Code of Conduct

vinaysaini94, Dec 01 '25 04:12

Not only is using GPs + qNEHVI slow in this regime; it's also not clear how you'd use the result effectively even if you could compute it faster. How are you going to reason about a 30-dimensional Pareto frontier? Is it even realistic to explore it reasonably well unless you have a very large budget (in which case another, higher-throughput MOO method may be better suited)?
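To make the "reasoning about a 30-dim Pareto frontier" point concrete: with random objective values, the fraction of nondominated points approaches 1 as the number of objectives grows, so Pareto dominance stops discriminating between candidates at all. A minimal, self-contained sketch (plain NumPy, not Ax code; the point count and seed are arbitrary):

```python
import numpy as np

def pareto_mask(Y):
    """Boolean mask of nondominated rows of Y, assuming maximization."""
    n = Y.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # point i is dominated if some other point is >= in every
        # objective and strictly > in at least one
        dominated = ((Y >= Y[i]).all(axis=1) & (Y > Y[i]).any(axis=1)).any()
        mask[i] = not dominated
    return mask

rng = np.random.default_rng(0)
n = 200  # hypothetical evaluation budget
frac2 = pareto_mask(rng.random((n, 2))).mean()    # 2 objectives: a few percent
frac30 = pareto_mask(rng.random((n, 30))).mean()  # 30 objectives: ~everything
```

With 2 objectives only a small fraction of random points sit on the frontier; with 30 objectives essentially every point is nondominated, so the "Pareto set" is nearly the whole data set.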

In practice, when we face problems like this with many outcomes, there is usually some decision maker who ultimately needs to judge the outcome of the optimization, so we typically do some kind of preference learning akin to https://arxiv.org/abs/2203.11382 (and the references therein).
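Short of full preference learning, one common way to sidestep hypervolume computation entirely is random scalarization (the ParEGO idea): each BO sub-problem collapses the M objectives into a single scalar via a randomly weighted augmented Chebyshev function, so you only ever optimize a one-dimensional objective. BoTorch provides a related utility, `get_chebyshev_scalarization`. Below is a dependency-light NumPy sketch of the scalarization itself; the function name, the `alpha` value, and the assumption that outcomes are normalized to [0, 1] and maximized are all illustrative, not Ax API:

```python
import numpy as np

def chebyshev_scalarize(Y, weights, alpha=0.05):
    """Augmented Chebyshev scalarization: collapse M objectives to one scalar.

    Y: (n, M) array of objective values, assumed normalized to [0, 1],
       all maximized. weights: (M,) nonnegative weights summing to 1.
    """
    weighted = weights * Y                 # (n, M) weighted objectives
    cheb = weighted.min(axis=1)            # worst weighted objective dominates
    # small augmentation term breaks ties and keeps all objectives in play
    return cheb + alpha * weighted.sum(axis=1)

rng = np.random.default_rng(0)
M = 30
w = rng.dirichlet(np.ones(M))   # random weight vector on the simplex
Y = rng.random((8, M))          # 8 hypothetical evaluated points, 30 metrics
scores = chebyshev_scalarize(Y, w)
best = int(scores.argmax())     # incumbent under this weight vector
```

Drawing a fresh weight vector per iteration (or per candidate in a batch, as qNParEGO does) spreads exploration across the frontier while keeping each surrogate/acquisition step single-objective.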

Balandat, Dec 01 '25 15:12

cc @ItsMrLin re the need for improved preference learning support in Ax :)

Balandat, Dec 01 '25 15:12