
Commit 8c8947b

update least squares example

1 parent: 4bb680d

File tree: 1 file changed (+30 −38 lines)

docs/src/examples/ls.md: 30 additions & 38 deletions
@@ -1,45 +1,53 @@
-# A regularized least-square problem
+# A regularized nonlinear least-squares problem
 
 In this tutorial, we will show how to model and solve the nonconvex, nonsmooth least-squares problem
 ```math
-\min_{x \in \mathbb{R}^n} \tfrac{1}{2} \|Ax - b\|_2^2 + \lambda \|x\|_0.
+\min_{x \in \mathbb{R}^2} \tfrac{1}{2} \sum_{i=1}^m \big(y_i - x_1 e^{x_2 t_i}\big)^2 + \lambda \|x\|_0.
 ```
+This problem models the fitting of an exponential curve to noisy data.
 
 ## Modelling the problem
 We first formulate the objective function as the sum of a smooth function $f$ and a nonsmooth regularizer $h$:
 ```math
-\tfrac{1}{2} \|Ax - b\|_2^2 + \lambda \|x\|_0 = f(x) + h(x),
+\tfrac{1}{2} \sum_{i=1}^m \big(y_i - x_1 e^{x_2 t_i}\big)^2 + \lambda \|x\|_0 = f(x) + h(x),
 ```
 where
 ```math
 \begin{align*}
-f(x) &:= \tfrac{1}{2} \|Ax - b\|_2^2,\\
+f(x) &:= \tfrac{1}{2} \sum_{i=1}^m \big(y_i - x_1 e^{x_2 t_i}\big)^2,\\
 h(x) &:= \lambda\|x\|_0.
 \end{align*}
 ```
 
-To model $f$, we are going to use [LLSModels.jl](https://github.com/JuliaSmoothOptimizers/LLSModels.jl).
+To model $f$, we are going to use [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl).
 For the nonsmooth regularizer, we observe that $h$ is readily available in [ProximalOperators.jl](https://github.com/JuliaFirstOrder/ProximalOperators.jl); refer to [this section](@ref regularizers) for a list of readily available regularizers.
 We then wrap the smooth function and the regularizer in a `RegularizedNLPModel`.
 
-```@example
-using LLSModels
+```@example ls
+using ADNLPModels
 using ProximalOperators
 using Random
 using RegularizedProblems
 
 Random.seed!(0)
 
-# Generate A, b
-m, n = 5, 10
-A = randn((m, n))
-b = randn(m)
+# Generate synthetic nonlinear least-squares data
+m = 100
+t = range(0, 1, length=m)
+a_true, b_true = 2.0, -1.0
+y = [a_true * exp(b_true * ti) + 0.1 * randn() for ti in t]
 
-# Choose a starting point for the optimization process
-x0 = randn(n)
+# Starting point
+x0 = [1.0, 0.0]  # [a, b]
 
-# Get an NLSModel corresponding to the smooth function f
-f_model = LLSModel(A, b, x0 = x0, name = "NLS model of f")
+# Define the nonlinear residuals
+function F(x)
+    a, b = x
+    return [yi - a * exp(b * ti) for (ti, yi) in zip(t, y)]
+end
+
+# Build an ADNLSModel for the smooth function f
+f_model = ADNLSModel(F, x0, m, name = "nonlinear LS model of f")
 
 # Get the regularizer from ProximalOperators
 λ = 1.0
@@ -52,34 +60,18 @@ regularized_pb = RegularizedNLPModel(f_model, h)
 ## Solving the problem
 We can now choose one of the solvers presented [here](@ref algorithms) to solve the problem we defined above.
 In the case of least-squares problems, it is usually more appropriate to choose LM or LMTR.
-```@example
-using LLSModels
-using ProximalOperators
-using Random
-using RegularizedProblems
-
-Random.seed!(0)
-
-m, n = 5, 10
-λ = 0.1
-A = randn((m, n))
-b = randn(m)
-
-x0 = 10*randn(n)
-
-f_model = LLSModel(A, b, x0 = x0, name = "NLS model of f")
-h = NormL0(λ)
-regularized_pb = RegularizedNLPModel(f_model, h)
 
+```@example ls
 using RegularizedOptimization
 
-# LM is a quadratic regularization method, we specify the verbosity and the tolerance of the solver
-out = LM(regularized_pb, verbose = 1, atol = 1e-3)
+# LM is a quadratic regularization method.
+out = LM(regularized_pb, verbose = 1, atol = 1e-4)
 println("LM converged after $(out.iter) iterations.")
-println("--------------------------------------------------------------------------------------")
+```
 
-# We can choose LMTR instead which is a trust-region method
-out = LMTR(regularized_pb, verbose = 1, atol = 1e-3)
+```@example ls
+# We can choose LMTR instead, which is a trust-region method
+out = LMTR(regularized_pb, verbose = 1, atol = 1e-4)
 println("LMTR converged after $(out.iter) iterations.")
 
 ```
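A quick way to sanity-check the updated model is to evaluate the residual vector of the `ADNLSModel` at the starting point. The sketch below is not part of the commit; it assumes the variables from the `@example ls` blocks above are in scope and uses `residual` from the NLPModels.jl API, which `ADNLSModel` implements:

```julia
using NLPModels  # provides residual() for nonlinear least-squares models

r0 = residual(f_model, x0)    # the residual vector F(x0), of length m = 100
println(0.5 * sum(abs2, r0))  # f(x0), the smooth part of the objective
```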

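For intuition on why `NormL0` pairs well with these solvers: the proximal mapping of $h(x) = \lambda\|x\|_0$ with step $\nu$ is hard thresholding, which zeroes every component with $|x_i| \le \sqrt{2\lambda\nu}$ and leaves the others unchanged, so the nonsmooth, nonconvex term can be handled in closed form. A standalone illustrative sketch (the helper `hard_threshold` is not from any of the packages above):

```julia
# Hard thresholding: the prox of h(x) = λ‖x‖₀ with step ν
hard_threshold(x, λ, ν) = [abs(xi) > sqrt(2 * λ * ν) ? xi : zero(xi) for xi in x]

# With λ = 1.0 and ν = 0.1 the threshold is √0.2 ≈ 0.447
println(hard_threshold([0.3, -2.0], 1.0, 0.1))  # [0.0, -2.0]
```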
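Finally, beyond the iteration counts printed in the example, the fitted parameters can be read off the solver output. This assumes `out` behaves like a SolverCore.jl `GenericExecutionStats`, consistent with the `out.iter` field used above; treat it as a sketch rather than confirmed API:

```julia
println(out.status)    # termination status, e.g. :first_order
println(out.solution)  # fitted [a, b]; close to [2.0, -1.0] if no component is thresholded to zero
```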