
Commit 48dd731

jduerholt authored and facebook-github-bot committed
Tidy-up feasibility evaluation in optimize_mixed_alternating (#2952)
Summary:

## Motivation

As discussed with esantorella in #2944 (comment), this PR tidies up the feasibility evaluation in `optimize_mixed_alternating`. The check is refactored to use `optim.parameter_constraints.evaluate_feasibility`, which improves code sharing.

Regarding the nonlinear inequality constraints that were also discussed in the thread above, I ran into the problem that nonlinear constraints only work with a batch limit of 1, which would make `optimize_mixed_alternating` highly inefficient since it relies on batch optimization. So I think we first have to lift the `batch_limit == 1` restriction for nonlinear constraints. Should I create a separate issue for this?

### Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)?

Yes.

Pull Request resolved: #2952

Test Plan: Unit tests.

Reviewed By: Balandat

Differential Revision: D79644900

Pulled By: saitcakmak

fbshipit-source-id: d7bfa0905ffc489f79d815f97a4444f5914e8915
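For context, BoTorch encodes a linear inequality constraint as a tuple `(indices, coefficients, rhs)` meaning `sum_i coefficients[i] * X[..., indices[i]] >= rhs`. The sketch below is not part of the commit; it simply mirrors the hand-rolled check that this PR removes from `_filter_infeasible`, with made-up tensor values for illustration.

```python
# Minimal sketch (not from the commit): the constraint check that this PR
# replaces with `evaluate_feasibility`. Values are illustrative only.
import torch

# Three candidate points in d = 2 dimensions.
X = torch.tensor([[0.2, 0.9], [0.8, 0.8], [0.1, 0.1]])

# Constraint in BoTorch's (indices, coefficients, rhs) format: x_0 + x_1 >= 1.
inequality_constraints = [(torch.tensor([0, 1]), torch.tensor([1.0, 1.0]), 1.0)]

# Hand-rolled check, as in the pre-refactor `_filter_infeasible`.
is_feasible = torch.ones(X.shape[:-1], device=X.device, dtype=torch.bool)
for idx, coef, rhs in inequality_constraints:
    is_feasible &= (X[..., idx] * coef).sum(dim=-1) >= rhs

print(is_feasible)      # tensor([ True,  True, False])
print(X[is_feasible])   # keeps only the feasible points
```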
1 parent 359d9ee · commit 48dd731


botorch/optim/optimize_mixed.py

Lines changed: 9 additions & 11 deletions
```diff
@@ -22,6 +22,7 @@
     _validate_sequential_inputs,
     OptimizeAcqfInputs,
 )
+from botorch.optim.parameter_constraints import evaluate_feasibility
 from botorch.optim.utils.acquisition_utils import fix_features, get_X_baseline
 from botorch.utils.sampling import (
     draw_sobol_samples,
@@ -134,17 +135,14 @@ def _filter_infeasible(
     Returns:
         The tensor `X` with infeasible points removed.
     """
-    if inequality_constraints is None and equality_constraints is None:
-        return X
-    is_feasible = torch.ones(X.shape[:-1], device=X.device, dtype=torch.bool)
-    if inequality_constraints is not None:
-        for idx, coef, rhs in inequality_constraints:
-            is_feasible &= (X[..., idx] * coef).sum(dim=-1) >= rhs
-    if equality_constraints is not None:
-        for idx, coef, rhs in equality_constraints:
-            is_feasible &= torch.isclose(
-                (X[..., idx] * coef).sum(dim=-1), torch.tensor(rhs).to(coef)
-            )
+    # X is reshaped to [n, 1, d] in order to be able to apply
+    # `evaluate_feasibility` which operates on the batch level
+    is_feasible = evaluate_feasibility(
+        X=X.unsqueeze(-2),
+        inequality_constraints=inequality_constraints,
+        equality_constraints=equality_constraints,
+        nonlinear_inequality_constraints=None,
+    )
     return X[is_feasible]
```
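The refactored `_filter_infeasible` now delegates entirely to `evaluate_feasibility`. Below is a rough usage sketch of my own (not from the commit) showing how the reshaped call is expected to behave; it assumes `evaluate_feasibility` reduces over the q-batch dimension and returns one boolean per candidate, as the diff above implies, and the constraint and tensor values are invented for illustration.

```python
# Rough sketch of the refactored filtering path. Assumes evaluate_feasibility
# returns a boolean mask of shape [n] for input of shape [n, 1, d], as the
# diff above implies; shapes and defaults beyond that are not verified here.
import torch
from botorch.optim.parameter_constraints import evaluate_feasibility

n, d = 4, 3
X = torch.rand(n, d)  # candidate points in d dimensions

# Linear constraint x_0 + x_2 >= 0.5 in (indices, coefficients, rhs) form.
inequality_constraints = [(torch.tensor([0, 2]), torch.tensor([1.0, 1.0]), 0.5)]

# Reshape to [n, 1, d] so each candidate is its own q=1 batch, then filter.
is_feasible = evaluate_feasibility(
    X=X.unsqueeze(-2),
    inequality_constraints=inequality_constraints,
    equality_constraints=None,
    nonlinear_inequality_constraints=None,
)
X_feasible = X[is_feasible]
print(X_feasible.shape)  # [n_feasible, d]
```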
