Commit 010ed38 (1 parent: d966da4)

Fix links in notebooks

File tree: 2 files changed (+5 −8 lines)

docs/source/algorithms_and_examples.md

Lines changed: 3 additions & 6 deletions

````diff
@@ -84,7 +84,6 @@ runner.live_plot()
 ```{code-cell} ipython3
 :tags: [hide-cell]
 
-
 def f(x, offset=0.07357338543088588):
     a = 0.01
     return x + a**2 / (a**2 + (x - offset) ** 2)
@@ -110,11 +109,9 @@ def get_hm(loss_per_interval, N=101):
     plots = {n: plot(learner, n) for n in range(N)}
     return hv.HoloMap(plots, kdims=["npoints"])
 
-
-layout = get_hm(uniform_loss).relabel("homogeneous samping") + get_hm(
-    default_loss
-).relabel("with adaptive")
-
+plot_homo = get_hm(uniform_loss).relabel("homogeneous samping")
+plot_adaptive = get_hm(default_loss).relabel("with adaptive")
+layout = plot_homo + plot_adaptive
 layout.opts(plot=dict(toolbar=None))
 ```
 
````
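The second hunk above replaces one chained expression with named intermediates, which is easier to read and diff. A minimal sketch of that refactor pattern, using a stand-in `Plot` class rather than holoviews itself (holoviews' `relabel` returns a relabeled copy, and `+` composes elements into a side-by-side Layout; the class and `get_hm` stub here are illustrative stand-ins, not the notebook's real objects):

```python
class Plot:
    """Stand-in for a holoviews element: supports relabel() and +."""

    def __init__(self, label=""):
        self.label = label

    def relabel(self, label):
        # Mirrors holoviews: return a new element carrying the new label.
        return Plot(label)

    def __add__(self, other):
        # Mirrors holoviews: `+` collects elements into a layout.
        return [self, other]


def get_hm(loss):
    # Stub for the notebook's get_hm(loss_per_interval).
    return Plot()


# Before the commit: one hard-to-scan chained expression.
layout = get_hm("uniform").relabel("homogeneous sampling") + get_hm(
    "default"
).relabel("with adaptive")

# After the commit: named intermediates, then compose.
plot_homo = get_hm("uniform").relabel("homogeneous sampling")
plot_adaptive = get_hm("default").relabel("with adaptive")
layout = plot_homo + plot_adaptive

print([p.label for p in layout])
# → ['homogeneous sampling', 'with adaptive']
```

Both forms build the same layout; the refactor only changes how the expression is written.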

docs/source/tutorial/tutorial.custom_loss.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -29,7 +29,7 @@ from functools import partial
 import holoviews as hv
 ```
 
-`~adaptive.Learner1D` and {class}`~adaptive.Learner2D` both work on the principle of subdividing their domain into subdomains, and assigning a property to each subdomain, which we call the *loss*.
+{class}`~adaptive.Learner1D` and {class}`~adaptive.Learner2D` both work on the principle of subdividing their domain into subdomains, and assigning a property to each subdomain, which we call the *loss*.
 The algorithm for choosing the best place to evaluate our function is then simply *take the subdomain with the largest loss and add a point in the center, creating new subdomains around this point*.
 
 The *loss function* that defines the loss per subdomain is the canonical place to define what regions of the domain are “interesting”.
@@ -164,4 +164,4 @@ Awesome! We zoom in on the singularity, but not at the expense of sampling the r
 
 The above strategy is available as {class}`adaptive.learner.learner2D.resolution_loss_function`.
 
-[^download]: This notebook can be downloaded as **{nb-download}`tutorial.custom-loss.ipynb`** and {download}`tutorial.custom-loss.md`.
+[^download]: This notebook can be downloaded as **{nb-download}`tutorial.custom_loss.ipynb`** and {download}`tutorial.custom_loss.md`.
````
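The first change in this file is why the commit message says "fix links": in MyST/Sphinx, a bare backtick span renders as literal inline code, while the `{class}` role makes Sphinx resolve the target as a cross-reference, and a leading `~` shortens the displayed link text to the final name. A minimal MyST fragment illustrating the difference (behavior as documented by MyST/Sphinx; `adaptive.Learner1D` is the real class from the diff above):

```md
<!-- renders as plain inline code, no hyperlink -->
`~adaptive.Learner1D`

<!-- resolves to a link to the class's API docs; the `~` makes the
     rendered link text just "Learner1D" instead of the full dotted path -->
{class}`~adaptive.Learner1D`
```

The footnote change in the second hunk is the complementary fix: the download roles must name the actual file (`tutorial.custom_loss`, with an underscore), or the generated download links point at a nonexistent `tutorial.custom-loss`.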
