To model $f$, we are going to use [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl).

For the nonsmooth regularizer, we observe that $h$ is readily available in [ProximalOperators.jl](https://github.com/JuliaFirstOrder/ProximalOperators.jl); see [this section](@ref regularizers) for the list of supported regularizers.

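For instance, assuming we pick the $\ell_1$ norm for $h$ (an illustrative choice; any regularizer from the list above works the same way), it can be instantiated directly:

```julia
using ProximalOperators

# Assumed choice of regularizer for illustration: h(x) = λ‖x‖₁ with λ = 1.0
h = NormL1(1.0)
```
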
We then wrap the smooth function and the regularizer in a `RegularizedNLPModel`.

```@example ls
using ADNLPModels
using ProximalOperators
using Random
using RegularizedProblems

Random.seed!(0)

# Generate synthetic nonlinear least-squares data
m = 100
t = range(0, 1, length = m)
a_true, b_true = 2.0, -1.0
y = [a_true * exp(b_true * ti) + 0.1 * randn() for ti in t]

# Starting point
x0 = [1.0, 0.0]  # [a, b]

# Define the nonlinear residuals Fᵢ(x) = yᵢ - a * exp(b * tᵢ)
function F(x)
  a, b = x
  return [yi - a * exp(b * ti) for (ti, yi) in zip(t, y)]
end

# Build an ADNLSModel corresponding to the smooth function f
f_model = ADNLSModel(F, x0, m, name = "nonlinear LS model of f")
```

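Finally, here is a minimal sketch of the wrapping step described above, assuming the regularizer `h` instantiated earlier and that `RegularizedNLPModel` takes the smooth model and the regularizer as its two positional arguments:

```julia
# Wrap the smooth model f_model and the regularizer h (both defined above)
# into a single regularized problem.
regularized_model = RegularizedNLPModel(f_model, h)
```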