Commit ad7990d

update doc for TR
1 parent c389274 commit ad7990d

File tree

1 file changed: +53 −28 lines

src/TR_alg.jl

Lines changed: 53 additions & 28 deletions
@@ -85,48 +85,73 @@ function TRSolver(
 end
 
 """
+    TR(reg_nlp; kwargs...)
     TR(nlp, h, χ, options; kwargs...)
 
 A trust-region method for the problem
 
     min f(x) + h(x)
 
-where f: ℝⁿ → ℝ has a Lipschitz-continuous Jacobian, and h: ℝⁿ → ℝ is
-lower semi-continuous and proper.
+where f: ℝⁿ → ℝ has a Lipschitz-continuous gradient, and h: ℝⁿ → ℝ is
+lower semi-continuous, proper and prox-bounded.
 
 About each iterate xₖ, a step sₖ is computed as an approximate solution of
 
     min φ(s; xₖ) + ψ(s; xₖ) subject to ‖s‖ ≤ Δₖ
 
 where φ(s; xₖ) = f(xₖ) + ∇f(xₖ)ᵀs + ½ sᵀ Bₖ s is a quadratic approximation of f about xₖ,
 ψ(s; xₖ) = h(xₖ + s), ‖⋅‖ is a user-defined norm and Δₖ > 0 is the trust-region radius.
-The subproblem is solved inexactly by way of a first-order method such as the proximal-gradient
-method or the quadratic regularization method.
 
-### Arguments
-
-* `nlp::AbstractNLPModel`: a smooth optimization problem
-* `h`: a regularizer such as those defined in ProximalOperators
-* `χ`: a norm used to define the trust region in the form of a regularizer
-* `options::ROSolverOptions`: a structure containing algorithmic parameters
-
-The objective, gradient and Hessian of `nlp` will be accessed.
-The Hessian is accessed as an abstract operator and need not be the exact Hessian.
-
-### Keyword arguments
-
-* `x0::AbstractVector`: an initial guess (default: `nlp.meta.x0`)
-* `subsolver_logger::AbstractLogger`: a logger to pass to the subproblem solver (default: the null logger)
-* `subsolver`: the procedure used to compute a step (`PG`, `R2` or `TRDH`)
-* `subsolver_options::ROSolverOptions`: default options to pass to the subsolver (default: all default options)
-* `selected::AbstractVector{<:Integer}`: (default `1:f.meta.nvar`)
-
-### Return values
-
-* `xk`: the final iterate
-* `Fobj_hist`: an array with the history of values of the smooth objective
-* `Hobj_hist`: an array with the history of values of the nonsmooth objective
-* `Complex_hist`: an array with the history of numbers of inner iterations.
+For advanced usage, first define a `TRSolver` to preallocate the memory used in the algorithm, and then call `solve!`:
+
+    solver = TRSolver(reg_nlp; χ = NormLinf(1), subsolver = R2Solver)
+    solve!(solver, reg_nlp)
+
+    stats = RegularizedExecutionStats(reg_nlp)
+    solve!(solver, reg_nlp, stats)
+
+# Arguments
+* `reg_nlp::AbstractRegularizedNLPModel{T, V}`: the problem to solve, see `RegularizedProblems.jl`, `NLPModels.jl`.
+
+# Keyword arguments
+- `x::V = nlp.meta.x0`: the initial guess;
+- `atol::T = √eps(T)`: absolute tolerance;
+- `rtol::T = √eps(T)`: relative tolerance;
+- `neg_tol::T = eps(T)^(1 / 4)`: negative tolerance;
+- `max_eval::Int = -1`: maximum number of evaluations of the objective function (a negative number means unlimited);
+- `max_time::Float64 = 30.0`: maximum time limit in seconds;
+- `max_iter::Int = 10000`: maximum number of iterations;
+- `verbose::Int = 0`: if > 0, display iteration details every `verbose` iterations;
+- `Δk::T = T(1)`: initial value of the trust-region radius;
+- `η1::T = √√eps(T)`: successful iteration threshold;
+- `η2::T = T(0.9)`: very successful iteration threshold;
+- `γ::T = T(3)`: trust-region radius multiplier, Δ := Δ*γ when the iteration is very successful and Δ := Δ/γ when the iteration is unsuccessful;
+- `α::T = 1/eps(T)`: TODO;
+- `β::T = 1/eps(T)`: TODO;
+- `χ::F = NormLinf(1)`: norm used to define the trust region;
+- `subsolver::S = R2Solver`: subsolver used to solve the subproblem that appears at each iteration.
+
+The algorithm stops either when `√(ξₖ/νₖ) < atol + rtol*√(ξ₀/ν₀)` or when `ξₖ < 0` and `√(-ξₖ/νₖ) < neg_tol`, where ξₖ := f(xₖ) + h(xₖ) - φ(sₖ; xₖ) - ψ(sₖ; xₖ), and √(ξₖ/νₖ) is a stationarity measure.
+
+# Output
+The value returned is a `GenericExecutionStats`, see `SolverCore.jl`.
+
+# Callback
+The callback is called at each iteration.
+The expected signature of the callback is `callback(nlp, solver, stats)`, and its output is ignored.
+Changing any of the input arguments will affect the subsequent iterations.
+In particular, setting `stats.status = :user` will stop the algorithm.
+All relevant information should be available in `nlp` and `solver`.
+Notably, you can access and modify the following:
+- `solver.xk`: current iterate;
+- `solver.∇fk`: current gradient;
+- `stats`: structure holding the output of the algorithm (`GenericExecutionStats`), which contains, among other things:
+  - `stats.iter`: current iteration counter;
+  - `stats.objective`: current objective function value;
+  - `stats.solver_specific[:smooth_obj]`: current value of the smooth part of the objective function;
+  - `stats.solver_specific[:nonsmooth_obj]`: current value of the nonsmooth part of the objective function;
+  - `stats.status`: current status of the algorithm; should be `:unknown` unless the algorithm has attained a stopping criterion. Changing this to anything other than `:unknown` will stop the algorithm, but you should use `:user` to properly indicate the intention;
+  - `stats.elapsed_time`: elapsed time in seconds.
 """
 function TR(
   f::AbstractNLPModel,
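The subproblem described in the updated docstring can be illustrated with a minimal, self-contained sketch in base Julia (all names here are hypothetical, not the package's internals; it assumes h = λ‖·‖₁ and χ = ℓ∞, so the trust region is a box and one proximal-gradient step on the model has a closed form):

```julia
# One proximal-gradient step on
#     min φ(s; xₖ) + ψ(s; xₖ)  subject to  ‖s‖∞ ≤ Δₖ
# with φ(s; x) = f(x) + ∇f(x)ᵀs + ½ sᵀ B s and ψ(s; x) = λ‖x + s‖₁.
# For separable ℓ1 plus a box, the constrained prox is soft-threshold then clamp.

# soft-thresholding: prox of t ↦ τ|t|
soft(t, τ) = sign(t) * max(abs(t) - τ, zero(t))

function prox_grad_step(x, s, ∇f, B, λ, Δ, ν)
    g = ∇f .+ B * s                   # ∇φ(s; x) = ∇f(x) + B s
    z = s .- ν .* g                   # gradient step on the smooth model φ
    t = soft.(x .+ z, ν * λ) .- x     # prox of s ↦ νλ‖x + s‖₁
    return clamp.(t, -Δ, Δ)           # project onto the box ‖s‖∞ ≤ Δ
end

x  = [1.0, -2.0]
s  = zeros(2)
∇f = [0.5, -0.3]
B  = [1.0 0.0; 0.0 1.0]               # Bₖ need not be the exact Hessian
s  = prox_grad_step(x, s, ∇f, B, 0.1, 0.5, 1.0)
```

In the package itself this inner loop is delegated to the `subsolver` (e.g. `R2Solver`), which iterates steps of this kind until the subproblem is solved to the required accuracy.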
