Fix NLopt crash with gradient-based algorithms when no AD backend specified
This fixes the issue reported in https://discourse.julialang.org/t/error-when-using-multistart-optimization/133174
## Problem
When using NLopt's gradient-based algorithms (like `LD_LBFGS`) without specifying
an AD backend in `OptimizationFunction`, the code would crash with:
`MethodError: objects of type Nothing are not callable`
This occurred because the NLopt wrapper called `cache.f.grad(G, θ)` at
line 181 while `cache.f.grad` was `nothing`, since no AD backend had been specified.
A minimal reproduction is sketched below.
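A minimal reproduction sketch (the Rosenbrock objective and starting values here are illustrative, not the exact code from the Discourse thread):

```julia
using Optimization, OptimizationNLopt, ForwardDiff

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

# No AD backend: the generated `grad` field stays `nothing`.
optf = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(optf, [0.0, 0.0], [1.0, 100.0])

# This call previously crashed with
# `MethodError: objects of type Nothing are not callable`;
# after this change it throws an informative `IncompatibleOptimizerError`:
# solve(prob, NLopt.LD_LBFGS())

# Works: an AD backend lets Optimization.jl generate the gradient.
optf_ad = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob_ad = OptimizationProblem(optf_ad, [0.0, 0.0], [1.0, 100.0])
solve(prob_ad, NLopt.LD_LBFGS())
```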
## Solution
Added a check in the `__solve` method: if the algorithm requires gradients but
`cache.f.grad` is `nothing`, we now throw a helpful `IncompatibleOptimizerError`
that guides users to (see the sketch after this list):
1. Use `OptimizationFunction` with an AD backend (e.g., `AutoForwardDiff()`)
2. Or provide gradients manually via the `grad` kwarg
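A rough sketch of the added guard; the `requiresgradient` predicate, the omitted module qualification on `IncompatibleOptimizerError`, and the error text are assumptions based on the description above, not the literal diff:

```julia
# Inside OptimizationNLopt.jl's `__solve`, before the NLopt objective
# callback that calls `cache.f.grad(G, θ)` is constructed.
# `requiresgradient(opt)` stands in for however gradient-based
# (LD_*/GD_*) algorithms are detected; the real check may differ.
if requiresgradient(cache.opt) && isnothing(cache.f.grad)
    throw(IncompatibleOptimizerError(
        "The chosen algorithm requires gradients, but no gradient function is " *
        "available. Construct the `OptimizationFunction` with an AD backend, " *
        "e.g. `OptimizationFunction(f, AutoForwardDiff())`, or pass gradients " *
        "manually via the `grad` keyword argument."))
end
```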
## Changes
1. **OptimizationNLopt.jl**: Added a gradient-availability check before gradients
are used, providing a clear error message for users
2. **runtests.jl**: Added comprehensive tests (sketched after this list) to verify:
- Error is thrown when gradient-based algorithms are used without AD
- Error is thrown with both `NLopt.LD_LBFGS()` and `NLopt.Opt(:LD_LBFGS, 2)`
- Gradient-free algorithms still work without AD backend
- Gradient-based algorithms work correctly when AD is provided
3. **multistartoptimization.md**: Fixed the documentation example to include an AD backend
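A sketch of the kind of tests described in item 2; the objective, tolerances, and the assumption that the error type is reachable as `Optimization.IncompatibleOptimizerError` are illustrative, not the exact test code:

```julia
using Optimization, OptimizationNLopt, ForwardDiff, Test

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = [0.0, 0.0]
p = [1.0, 100.0]

# Gradient-based algorithms without AD should throw the new error, whether the
# algorithm is given as `NLopt.LD_LBFGS()` or `NLopt.Opt(:LD_LBFGS, 2)`.
prob = OptimizationProblem(OptimizationFunction(rosenbrock), u0, p)
@test_throws Optimization.IncompatibleOptimizerError solve(prob, NLopt.LD_LBFGS())
@test_throws Optimization.IncompatibleOptimizerError solve(prob, NLopt.Opt(:LD_LBFGS, 2))

# Gradient-free algorithms still work without an AD backend
# (maxiters is just a guard so the sketch terminates).
sol = solve(prob, NLopt.LN_NELDERMEAD(); maxiters = 10_000)
@test sol.objective < rosenbrock(u0, p)

# Gradient-based algorithms work once an AD backend is provided.
prob_ad = OptimizationProblem(
    OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff()), u0, p)
sol = solve(prob_ad, NLopt.LD_LBFGS())
@test sol.u ≈ [1.0, 1.0] atol = 1e-3
```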
## Test Results
All tests pass, including the new test that reproduces the Discourse issue.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <[email protected]>