Unexpected behavior when trying to minimize bitvector #429
-
I have code which builds up some constraints over bitvectors and then tries to minimize the value of one of them. Code to reproduce the issue can be found at https://github.com/danielrainer/z3_rust_minimize_repro

What I expected my code to do:
What happens:
Am I misusing the API, do I misunderstand some semantics, or is this a bug? For reference, the version of Z3 installed on my system is
When I extend the constraints file by appending

and directly invoking Z3 from the command line, the resulting model has the expected value of
Looking over this, I think I see the issue. You are using both `Solver` and `Optimize`, but these two types are independent of each other: using `Optimize::minimize` has no effect on `Solver::check`, and `Solver::from_string` has no effect on `Optimize::minimize`. `Optimize` is essentially a flavor of `Solver` which also supports minimization/maximization. It has its own `from_string` and its own `check`. So for your code to work as intended, you should be doing the assertions and checking on the `Optimize` itself.