
Commit 7c333dd

Merge pull request #155 from SciML/square-baseshow
Fix errors from `Base.show` implementations
2 parents: 7c058f7 + 23ee581

6 files changed: +95 −29 lines changed


docs/src/demos/damped_SHO.md

Lines changed: 22 additions & 6 deletions

@@ -132,6 +132,7 @@ chain = [Chain(
     Dense(dim_hidden, 1)
 ) for _ in 1:dim_output]
 ps, st = Lux.setup(rng, chain)
+nothing # hide
 ```
 
 Since `Lux.setup` defaults to `Float32` parameters for `Dense` layers, we set up the bounds and parameters using `Float32` as well.
@@ -143,6 +144,9 @@ ps = ps |> ComponentArray |> f64
 st = st |> f64
 ```
 
+To train the model, we'll sample 1000 points from the domain using [`NeuralPDE.QuasiRandomTraining`](https://docs.sciml.ai/NeuralPDE/stable/manual/training_strategies/#NeuralPDE.QuasiRandomTraining).
+See the [NeuralPDE.jl docs](https://docs.sciml.ai/NeuralPDE/stable/) for more on how the `PDESystem` will be converted into an `OptimizationProblem`.
+
 ```@example SHO
 using NeuralPDE
 
@@ -156,28 +160,40 @@ We now define our Lyapunov candidate structure along with the form of the Lyapun
 
 For this example, let's use a Lyapunov candidate
 ```math
-V(x) = \lVert \phi(x) \rVert^2 + \delta \log \left( 1 + \lVert x \rVert^2 \right),
+V(x) = \lVert \phi(x) \rVert^2 + \delta \log \left( 1 + \lVert x \rVert^2 \right).
 ```
-which structurally enforces nonnegativity, but doesn't guarantee ``V([0, 0]) = 0``.
-We therefore don't need a term in the loss function enforcing ``V(x) > 0 \, \forall x \ne 0``, but we do need something enforcing ``V([0, 0]) = 0``.
-So, we use [`DontCheckNonnegativity(check_fixed_point = true)`](@ref).
-
-To train for exponential stability we use [`ExponentialStability`](@ref), but we must specify the rate of exponential decrease, which we know in this case to be ``\zeta \omega_0``.
 
 ```@example SHO
 using NeuralLyapunov
 
 # Define neural Lyapunov structure and corresponding minimization condition
 structure = NonnegativeStructure(dim_output; δ = 1.0f-6)
+```
+
+This structure enforces nonnegativity, but doesn't guarantee ``V([0, 0]) = 0``.
+We therefore don't need a term in the loss function enforcing ``V(x) > 0 \, \forall x \ne 0``, but we do need something enforcing ``V([0, 0]) = 0``.
+So, we use [`DontCheckNonnegativity(check_fixed_point = true)`](@ref).
+
+```@example SHO
 minimization_condition = DontCheckNonnegativity(check_fixed_point = true)
+```
 
+To train for exponential stability we use [`ExponentialStability`](@ref), but we must specify the rate of exponential decrease, which we know in this case to be ``\zeta \omega_0``.
+
+```@example SHO
 # Define Lyapunov decrease condition
 # Damped SHO has exponential stability at a rate of k = ζ * ω_0, so we train to certify that
 decrease_condition = ExponentialStability(prod(p))
+```
+
+We package these in a `NeuralLyapunovSpecification` and use it to construct a `PDESystem`.
 
+```@example SHO
 # Construct neural Lyapunov specification
 spec = NeuralLyapunovSpecification(structure, minimization_condition, decrease_condition)
+```
 
+```@example SHO
 # Construct PDESystem
 @named pde_system = NeuralLyapunovPDESystem(dynamics, lb, ub, spec; p)
 ```
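A note on the `nothing # hide` line this commit appends to the setup block: in a Documenter.jl `@example` block, the value of the final expression is rendered into the generated page, so ending with `nothing` suppresses the verbose `(ps, st)` tuple returned by `Lux.setup`, and the `# hide` comment keeps that line out of the rendered source. A minimal stand-in (no Lux required; `setup_output` is a hypothetical placeholder):

```julia
# Stand-in for the doc-block pattern: `setup_output` mimics the verbose
# (parameters, states) tuple returned by `Lux.setup(rng, chain)`.
setup_output = (ps = rand(Float32, 4), st = NamedTuple())

# In a Documenter `@example` block, ending with `nothing` means no value is
# rendered below the block; `# hide` omits this line from the shown source.
nothing # hide
```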

docs/src/demos/policy_search.md

Lines changed: 8 additions & 4 deletions

@@ -132,23 +132,26 @@ Random.seed!(200)
 
 In this example, we'll use the [`Pendulum`](@ref) model in [NeuralLyapunovProblemLibrary.jl](../lib.md).
 
-Since the angle ``\theta`` is periodic with period ``2\pi``, our box domain will be one period in ``\theta`` and an interval in ``\frac{d\theta}{dt}``.
-
 ```@example policy_search
 using ModelingToolkit, NeuralLyapunovProblemLibrary
 using ModelingToolkit: unbound_inputs
 
 @named pendulum = Pendulum(; defaults = [0.5, 1.0])
 τ, = unbound_inputs(pendulum)
 pendulum = mtkcompile(pendulum; inputs = [τ], split = false)
+```
+
+Since the angle ``\theta`` is periodic with period ``2\pi``, our box domain will be one period in ``\theta`` and an interval in ``\frac{d\theta}{dt}``.
+
+```@example policy_search
+upright_equilibrium = [π, 0.0]
+
 θ, ω = unknowns(pendulum)
 
 bounds = [
     θ ∈ (0, 2π),
     ω ∈ (-2.0, 2.0)
 ]
-
-upright_equilibrium = [π, 0.0]
 ```
 
 We'll use an architecture that's ``2\pi``-periodic in ``\theta`` so that we can train on just one period of ``\theta`` and don't need to add any periodic boundary conditions.
@@ -182,6 +185,7 @@ chain = [Chain(
 ps, st = Lux.setup(rng, chain)
 ps = ps |> ComponentArray |> f64
 st = st |> f64
+nothing #hide
 ```
 
 ```@example policy_search

docs/src/demos/roa_estimation.md

Lines changed: 14 additions & 3 deletions

@@ -127,6 +127,7 @@ chain = [Chain(
 ps, st = Lux.setup(rng, chain)
 ps = ps |> ComponentArray |> f64
 st = st |> f64
+nothing # hide
 ```
 
 ```@example RoA
@@ -147,18 +148,28 @@ V(x) = \left( 1 + \lVert \phi(x) \rVert^2 \right) \log \left( 1 + \lVert x \rVer
 which structurally enforces positive definiteness.
 We therefore use [`DontCheckNonnegativity()`](@ref).
 
-We only require asymptotic stability in this example, but we use [`make_RoA_aware`](@ref) to only penalize positive values of ``\dot{V}(x)`` when ``V(x) \le 1``.
-
 ```@example RoA
 using NeuralLyapunov
 
 # Define neural Lyapunov structure
 structure = PositiveSemiDefiniteStructure(dim_output)
 minimization_condition = DontCheckNonnegativity()
+```
 
+We only require asymptotic stability in this example, but we use [`make_RoA_aware`](@ref) to only penalize positive values of ``\dot{V}(x)`` when ``V(x) \le 1``.
+
+```@example RoA
 # Define Lyapunov decrease condition
-decrease_condition = make_RoA_aware(AsymptoticStability())
+decrease_condition = AsymptoticStability()
+```
 
+```@example RoA
+decrease_condition = make_RoA_aware(decrease_condition)
+```
+
+We package these in a `NeuralLyapunovSpecification` and use it to construct a `PDESystem`.
+
+```@example RoA
 # Construct neural Lyapunov specification
 spec = NeuralLyapunovSpecification(structure, minimization_condition, decrease_condition)
 
src/conditions_specification.jl

Lines changed: 1 addition & 1 deletion

@@ -45,7 +45,7 @@ function Base.show(io::IO, s::NeuralLyapunovStructure)
         V = replace(V, r"\^2" => "²")
         println(io, "    V(x) = ", V)
     catch e
-        println(io, "    V(x) = <could not display: $(e)>")
+        println(io, "    V(x) = <could not display: $e>")
     end
     try
         V̇ = string(s.V̇(φ, Jφ, f, x, p, t, x_0))

src/decrease_conditions.jl

Lines changed: 32 additions & 8 deletions

@@ -66,12 +66,36 @@ function Base.show(io::IO, cond::LyapunovDecreaseCondition)
     println(io, "LyapunovDecreaseCondition")
 
     if cond.check_decrease
-        @variables x x_0 a V(..) V̇(..)
-        str = string(-cond.strength(x, x_0))
-        rec = string(cond.rectifier(a))
-        rm = string(cond.rate_metric(V(x), V̇(x)))
-        println(io, "    Trains for $rm ≤ $str")
-        print(io, "    with approximation a ≤ 0 => $rec ≈ 0")
+        print(io, "    Trains for ")
+
+        try
+            @variables x V(..) V̇(..)
+            rm = string(cond.rate_metric(V(x), V̇(x)))
+            # Replace ^2 with ²
+            rm = replace(rm, r"\^2" => "²")
+            print(io, "$rm ≤ ")
+        catch e
+            print(io, "<could not display rate_metric(V(x), V̇(x)): $e> ≤ ")
+        end
+
+        try
+            @variables x x_0
+            str = string(-cond.strength(x, x_0))
+            # Replace ^2 with ²
+            str = replace(str, r"\^2" => "²")
+            println(io, "$str")
+        catch e
+            println(io, "<could not display strength(x, x_0): $e>")
+        end
+
+        try
+            @variables a
+            rec = string(cond.rectifier(a))
+            rec = replace(rec, r"\^2" => "²")
+            print(io, "    with approximation a ≤ 0 => $rec ≈ 0")
+        catch e
+            println(io, "    with approximation a ≤ 0 => <could not display rectifier(a): $e> ≈ 0")
+        end
     else
         print(io, "    Does not train for decrease of V along trajectories")
     end
@@ -154,14 +178,14 @@ but differentiable approximations of this function may be employed.
 ```jldoctest
 julia> AsymptoticStability()
 LyapunovDecreaseCondition
-    Trains for V̇(x) ≤ -1.0e-6((x - x_0)^2)
+    Trains for V̇(x) ≤ -1.0e-6((x - x_0)²)
     with approximation a ≤ 0 => max(0, a) ≈ 0
 
 julia> softplus = (t) -> log(one(t) + exp(t));
 
 julia> AsymptoticStability(C = 0.1, rectifier = softplus)
 LyapunovDecreaseCondition
-    Trains for V̇(x) ≤ -0.1((x - x_0)^2)
+    Trains for V̇(x) ≤ -0.1((x - x_0)²)
     with approximation a ≤ 0 => log(1 + exp(a)) ≈ 0
 ```
 """

src/minimization_conditions.jl

Lines changed: 18 additions & 7 deletions

@@ -56,11 +56,22 @@ function Base.show(io::IO, cond::LyapunovMinimizationCondition)
     println(io, "LyapunovMinimizationCondition")
 
     if cond.check_nonnegativity
-        @variables x x_0 a
-        str = string(cond.strength(x, x_0))
-        println(io, "    Trains for V(x) ≥ $str")
-        rec = string(cond.rectifier(a))
-        println(io, "    with approximation a ≤ 0 => $rec ≈ 0")
+        try
+            @variables x x_0
+            str = string(cond.strength(x, x_0))
+            str = replace(str, r"\^2" => "²")
+            println(io, "    Trains for V(x) ≥ $str")
+        catch e
+            println(io, "    Trains for V(x) ≥ <could not display strength(x, x_0): $e>")
+        end
+        try
+            @variables a
+            rec = string(cond.rectifier(a))
+            rec = replace(rec, r"\^2" => "²")
+            println(io, "    with approximation a ≤ 0 => $rec ≈ 0")
+        catch e
+            println(io, "    with approximation a ≤ 0 => <could not display rectifier(a): $e> ≈ 0")
+        end
     else
         println(io, "    Does not train for nonnegativity of V(x)")
     end
@@ -111,15 +122,15 @@ exactly represents ``V(x) ≥ C \\lVert x - x_0 \\rVert^2``. ``C`` defaults to `
 ```jldoctest
 julia> StrictlyPositiveDefinite()
 LyapunovMinimizationCondition
-    Trains for V(x) ≥ 1.0e-6((x - x_0)^2)
+    Trains for V(x) ≥ 1.0e-6((x - x_0)²)
     with approximation a ≤ 0 => max(0, a) ≈ 0
     Trains for V(x_0) = 0
 
 julia> softplus = (t) -> log(one(t) + exp(t));
 
 julia> StrictlyPositiveDefinite(C = 0.1, rectifier = softplus, check_fixed_point = false)
 LyapunovMinimizationCondition
-    Trains for V(x) ≥ 0.1((x - x_0)^2)
+    Trains for V(x) ≥ 0.1((x - x_0)²)
     with approximation a ≤ 0 => log(1 + exp(a)) ≈ 0
     Does not train for V(x_0) = 0
 ```
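The `^2` → `²` substitution behind the updated doctest outputs is a plain regex replacement applied just before printing; a minimal standalone version (`prettify_squares` is an illustrative name, not a package function):

```julia
# Cosmetic substitution used by the updated `Base.show` methods: rewrite the
# ASCII power `^2` as the superscript `²` before printing.
prettify_squares(s::AbstractString) = replace(s, r"\^2" => "²")
```

For example, `prettify_squares("1.0e-6((x - x_0)^2)")` yields `"1.0e-6((x - x_0)²)"`, matching the new doctest lines; other exponents are left untouched.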
