Fix solve interface to use CommonSolve dispatches correctly
The `__init` and `__solve` functions are internal hooks, but the actual
dispatches should be to CommonSolve.jl's `init`, `solve`, and `solve!`
functions (which are imported via SciMLBase).
Changes:
- Import `init`, `solve`, `solve!`, `__init`, and `__solve` from SciMLBase
- Change function definitions from `SciMLBase.solve` to `solve` to properly
extend the CommonSolve interface
- Remove `SciMLBase.` prefix from function calls to use the imported functions
directly
- Keep type annotations with `SciMLBase.` prefix (these are correct)
This matches the pattern used in other SciML packages like OrdinaryDiffEq.jl
and aligns with the CommonSolve.jl interface design.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <[email protected]>
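A minimal sketch of the dispatch pattern the message describes, assuming the package already defines an internal `__solve` hook; the problem type and method body here are illustrative, not the package's actual implementation:

```julia
# Sketch only: import the CommonSolve functions (re-exported by SciMLBase)
# together with the internal hooks.
import SciMLBase: init, solve, solve!, __init, __solve

# Adding methods to the bare `solve` extends the shared CommonSolve interface;
# the `SciMLBase.` prefix remains only in the type annotation.
function solve(prob::SciMLBase.OptimizationProblem, alg, args...; kwargs...)
    return __solve(prob, alg, args...; kwargs...)  # delegate to the internal hook
end
```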
throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) does not support box constraints. Either remove the `lb` or `ub` bounds passed to `OptimizationProblem` or use a different algorithm."))
throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires box constraints. Either pass `lb` and `ub` bounds to `OptimizationProblem` or use a different algorithm."))
throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) does not support constraints. Either remove the `cons` function passed to `OptimizationFunction` or use a different algorithm."))
throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) does not support callbacks, remove the `callback` keyword argument from the `solve` call."))
-    SciMLBase.requiresgradient(alg) &&
+    requiresgradient(alg) &&
         !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
         throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires gradients, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoForwardDiff())` or pass it in with `grad` kwarg."))
-    SciMLBase.requireshessian(alg) &&
+    requireshessian(alg) &&
         !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
         throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires hessians, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoFiniteDiff(); kwargs...)` or pass them in with `hess` kwarg."))
-    SciMLBase.requiresconsjac(alg) &&
+    requiresconsjac(alg) &&
         !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
         throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires constraint jacobians, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoFiniteDiff(); kwargs...)` or pass them in with `cons` kwarg."))
-    SciMLBase.requiresconshess(alg) &&
+    requiresconshess(alg) &&
         !(prob.f isa SciMLBase.AbstractOptimizationFunction) &&
         throw(IncompatibleOptimizerError("The algorithm $(typeof(alg)) requires constraint hessians, hence use `OptimizationFunction` to generate them with an automatic differentiation backend e.g. `OptimizationFunction(f, AutoFiniteDiff(), AutoFiniteDiff(hess=true); kwargs...)` or pass them in with `cons` kwarg."))
     return
@@ -184,13 +184,13 @@ These arguments can be passed as `kwargs...` to `init`.
See also [`solve(prob::OptimizationProblem, alg, args...; kwargs...)`](@ref)
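A hedged usage sketch of the two-stage `init`/`solve!` workflow this docstring refers to, where solver options are passed as `kwargs...` to `init`; the objective, the `LBFGS()` algorithm from OptimizationOptimJL, and the keyword values are placeholders, not the documented defaults:

```julia
using Optimization, OptimizationOptimJL, ADTypes

# Placeholder objective and starting point for illustration.
rosenbrock(u, p) = (1.0 - u[1])^2 + 100.0 * (u[2] - u[1]^2)^2
optf = OptimizationFunction(rosenbrock, AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2))

# Solver-specific options go to `init`; `solve!` then runs the cached solver.
cache = init(prob, LBFGS(); maxiters = 100)
sol = solve!(cache)
```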