
Commit 9a16b75

Merge pull request #574 from ChrisRackauckas-Claude/docs-improvements-20251229-091549
Documentation improvements: fix broken links, typos, and placeholder text
2 parents d56cf2d + 669f64f commit 9a16b75

6 files changed, +10 -10 lines changed


docs/src/citations.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ If you are using `DataDrivenDiffEq.jl` for research, please cite
 }
 ```
 
-If you are using the [SymbolicRegression.jl](https://ai.damtp.cam.ac.uk/symbolicregression/dev/) API, please cite
+If you are using the [SymbolicRegression.jl](https://ai.damtp.cam.ac.uk/symbolicregression/stable/) API, please cite
 
 ```bibtex
 @misc{cranmerInterpretableMachineLearning2023,

docs/src/libs/datadrivendmd/example_04.jl

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-# # [Nonlinear Time Continuous System](@id nonlinear_continuos)
+# # [Nonlinear Time Continuous System](@id nonlinear_continuous)
 #
 # Similarly, we can use the [Extended Dynamic Mode Decomposition](https://link.springer.com/article/10.1007/s00332-015-9258-5) via a nonlinear [`Basis`](@ref) of observables. Here, we will look at a rather [famous example](https://arxiv.org/pdf/1510.03007) with a finite dimensional solution.
 
@@ -66,7 +66,7 @@ res = solve(prob, Ψ, DMDPINV(), digits = 2)
 # And plot the results
 #md plot(res)
 
-#md # ## [Copy-Pasteable Code](@id linear_discrete_copy_paste)
+#md # ## [Copy-Pasteable Code](@id nonlinear_continuous_copy_paste)
 #md #
 #md # ```julia
 #md # @__CODE__
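The hunk above touches the Extended DMD tutorial, which works by lifting the state with nonlinear observables so the Koopman operator acts linearly on them. A minimal sketch of that idea, assuming the tutorial's slow-manifold system and the constructors visible in the hunk context (`Basis`, `DMDPINV`); the parameter values and the basis are illustrative, not the tutorial's verbatim code:

```julia
using DataDrivenDiffEq, DataDrivenDMD, ModelingToolkit, OrdinaryDiffEq

# Illustrative nonlinear system with a known finite-dimensional Koopman embedding.
function slow_manifold!(du, u, p, t)
    du[1] = p[1] * u[1]
    du[2] = p[2] * (u[2] - u[1]^2)
end

sys = ODEProblem(slow_manifold!, [3.0, -2.0], (0.0, 5.0), [-0.8, -0.7])
sol = solve(sys, Tsit5())
prob = DataDrivenProblem(sol)

# Lift the state with the nonlinear observable x₁², so EDMD can recover a linear
# model in the lifted coordinates.
@variables x[1:2]
Ψ = Basis([x[1]; x[2]; x[1]^2], x)
res = solve(prob, Ψ, DMDPINV(), digits = 2)
```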

docs/src/libs/datadrivensparse/example_04.jl

Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ basis = Basis(eqs, x, independent_variable = t, implicits = D.(x))
 
 #md plot(dd_prob)
 
-# Next to varying over different sparsity penalties, we also want to batch our data using the **TEXT**
+# Next to varying over different sparsity penalties, we also want to batch our data using [`DataProcessing`](@ref).
 
 sampler = DataProcessing(split = 0.8, shuffle = true, batchsize = 30)
 res = solve(dd_prob, basis, ImplicitOptimizer(STLSQ(1e-2:1e-2:1.0)),
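For context on the `DataProcessing` reference introduced above, here is a hedged sketch of how the sampler would be threaded into the truncated `solve` call; it assumes the sampler is forwarded through the `data_processing` keyword of `DataDrivenCommonOptions`, and any option names beyond those shown in the diff are assumptions:

```julia
# Batch and split the data: 80 % for training, shuffled, batches of 30 samples.
sampler = DataProcessing(split = 0.8, shuffle = true, batchsize = 30)

# Hypothetical completion of the truncated solve call; the sampler controls how the
# optimizer sees the data during the implicit sparse regression.
res = solve(dd_prob, basis, ImplicitOptimizer(STLSQ(1e-2:1e-2:1.0)),
            options = DataDrivenCommonOptions(data_processing = sampler))
```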

docs/src/libs/datadrivensr/example_01.jl

Lines changed: 2 additions & 2 deletions
@@ -33,7 +33,7 @@ prob = ContinuousDataDrivenProblem(X, t, U = U)
 
 #md plot(prob)
 
-# To solve our problem, we will use [`EQSearch`](@ref), which provides a wrapper for the [symbolic regression interface](https://astroautomata.com/SymbolicRegression.jl/v0.6/api/#Options).
+# To solve our problem, we will use [`EQSearch`](@ref), which provides a wrapper for the [symbolic regression interface](https://ai.damtp.cam.ac.uk/symbolicregression/stable/api/#Options).
 # We will stick to simple operations, use a `L1DistLoss`, and limit the verbosity of the algorithm.
 
 eqsearch_options = SymbolicRegression.Options(binary_operators = [+, *],
@@ -44,7 +44,7 @@ eqsearch_options = SymbolicRegression.Options(binary_operators = [+, *],
 alg = EQSearch(eq_options = eqsearch_options)
 
 # Again, we `solve` the problem to obtain a [`DataDrivenSolution`](@ref). Note that any additional keyword arguments are passed onto
-# symbolic regressions [`EquationSearch`](https://astroautomata.com/SymbolicRegression.jl/v0.6/api/#EquationSearch) with the exception of `niterations` which
+# symbolic regressions [`EquationSearch`](https://ai.damtp.cam.ac.uk/symbolicregression/stable/api/#EquationSearch) with the exception of `niterations` which
 # is `maxiters`
 
 res = solve(prob, alg, options = DataDrivenCommonOptions(maxiters = 100))
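Assembled from the context lines of the two hunks above, an end-to-end sketch of the `EQSearch` workflow; only arguments visible in the diff are used, and the loss and verbosity settings mentioned in the prose are omitted because their keyword names depend on the installed SymbolicRegression.jl version:

```julia
using DataDrivenDiffEq, DataDrivenSR, SymbolicRegression

# X, t, U are assumed to hold the trajectory, time points, and control signal built earlier.
prob = ContinuousDataDrivenProblem(X, t, U = U)

# Restrict the search to simple binary operators.
eqsearch_options = SymbolicRegression.Options(binary_operators = [+, *])
alg = EQSearch(eq_options = eqsearch_options)

# maxiters is forwarded to EquationSearch as niterations.
res = solve(prob, alg, options = DataDrivenCommonOptions(maxiters = 100))
system = get_basis(res)
```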

docs/src/libs/datadrivensr/example_02.jl

Lines changed: 3 additions & 3 deletions
@@ -19,14 +19,14 @@ tspan = (0.0, 10.0)
 sys = ODEProblem{true, SciMLBase.NoSpecialize}(pendulum!, u0, tspan)
 sol = solve(sys, Tsit5());
 
-# We will use the data provided by our problem, but add the control signal `U = sin(0.5*t)` to it. Instead of using a function, like in [another example](@ref linear_continuous_controls)
+# We will use the data provided by our problem.
 prob = DataDrivenProblem(sol)
 
 # And plot the problems data.
 
 #md plot(prob)
 
-# To solve our problem, we will use [`EQSearch`](@ref), which provides a wrapper for the [symbolic regression interface](https://astroautomata.com/SymbolicRegression.jl/v0.6/api/#Options).
+# To solve our problem, we will use [`EQSearch`](@ref), which provides a wrapper for the [symbolic regression interface](https://ai.damtp.cam.ac.uk/symbolicregression/stable/api/#Options).
 # We will stick to simple operations, use a `L1DistLoss`, and limit the verbosity of the algorithm.
 # Note that we do not include `sin`, but rather lift the search space of variables.
 
@@ -56,7 +56,7 @@ res = solve(prob, basis, alg, options = DataDrivenCommonOptions(maxiters = 100))
 system = get_basis(res)
 #md println(system) # hide
 
-#md # ## [Copy-Pasteable Code](@id symbolic_regression_simple_copy_paste)
+#md # ## [Copy-Pasteable Code](@id symbolic_regression_lifted_copy_paste)
 #md #
 #md # ```julia
 #md # @__CODE__
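The "lifting" remark in the first hunk means the sine nonlinearity enters through the candidate basis rather than through SymbolicRegression's operator set. A hedged sketch of such a lifted basis (the observables are illustrative, not the tutorial's exact definition):

```julia
using DataDrivenDiffEq, ModelingToolkit

@variables u[1:2]
# Offer sine and cosine of the angle as extra observables instead of handing `sin`
# to the symbolic regression as a unary operator.
lifted_basis = Basis([u[1]; u[2]; sin(u[1]); cos(u[1])], u)

# Then solve on the lifted candidates, e.g.
# res = solve(prob, lifted_basis, alg, options = DataDrivenCommonOptions(maxiters = 100))
```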

docs/src/problems.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ DataDrivenProblem
 
 ## Defining a Problem
 
-Problems of identification, estimation, or inference are defined by data. These data contain at least measurements of the states `X`, which would be sufficient to describe a `[DiscreteDataDrivenProblem`](@ref) with unit time steps similar to the first example on dynamic mode decomposition. Of course, we can extend this to include time points `t`, control signals `U` or a function describing those `u(x,p,t)`. Additionally, any parameters `p` known a priori can be included in the problem. In practice, this looks like:
+Problems of identification, estimation, or inference are defined by data. These data contain at least measurements of the states `X`, which would be sufficient to describe a [`DiscreteDataDrivenProblem`](@ref) with unit time steps similar to the first example on dynamic mode decomposition. Of course, we can extend this to include time points `t`, control signals `U` or a function describing those `u(x,p,t)`. Additionally, any parameters `p` known a priori can be included in the problem. In practice, this looks like:
 
 ```julia
 problem = DiscreteDataDrivenProblem(X)
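The extensions described in that paragraph continue the snippet roughly as follows; the keyword forms for `U` and `p` are assumed here for illustration:

```julia
problem = DiscreteDataDrivenProblem(X)                        # states only, unit time steps
problem = DiscreteDataDrivenProblem(X, t)                     # explicit time points
problem = DiscreteDataDrivenProblem(X, t, U = U)              # measured control signal
problem = DiscreteDataDrivenProblem(X, t, U = (x, p, t) -> -0.5 .* x, p = p)  # control as a function, known parameters
```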
