Commit d966da4: Lower loss requirement
1 parent: 3fc2ed8

File tree: 1 file changed, +4 −2 lines

docs/source/tutorial/tutorial.custom_loss.md

Lines changed: 4 additions & 2 deletions
````diff
@@ -99,7 +99,7 @@ learner = adaptive.Learner2D(
 )
 
 # this takes a while, so use the async Runner so we know *something* is happening
-runner = adaptive.Runner(learner, goal=lambda l: l.loss() < 0.02)
+runner = adaptive.Runner(learner, goal=lambda l: l.loss() < 0.03 or l.npoints > 1000)
 ```
 
 ```{code-cell} ipython3
@@ -155,7 +155,9 @@ loss = resolution_loss_function(min_distance=0.01)
 
 learner = adaptive.Learner2D(f_divergent_2d, [(-1, 1), (-1, 1)], loss_per_triangle=loss)
 runner = adaptive.BlockingRunner(learner, goal=lambda l: l.loss() < 0.02)
-learner.plot(tri_alpha=0.3).relabel("1 / (x^2 + y^2) in log scale").opts(hv.opts.EdgePaths(color='w'), hv.opts.Image(logz=True, colorbar=True))
+learner.plot(tri_alpha=0.3).relabel("1 / (x^2 + y^2) in log scale").opts(
+    hv.opts.EdgePaths(color="w"), hv.opts.Image(logz=True, colorbar=True)
+)
 ```
 
 Awesome! We zoom in on the singularity, but not at the expense of sampling the rest of the domain a reasonable amount.
````
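The changed line relaxes the stopping rule: the `Runner` now halts when the loss drops below the (looser) threshold *or* a point budget is exceeded, so the tutorial cell cannot run unboundedly. A minimal sketch of that goal predicate in isolation, using a hypothetical `FakeLearner` stand-in (the real object is an `adaptive.Learner2D`, which exposes `loss()` and `npoints`):

```python
# Stand-in for adaptive.Learner2D, exposing only what the goal predicate reads.
# This class is hypothetical, for illustration; it is not part of adaptive.
class FakeLearner:
    def __init__(self, loss_value, npoints):
        self._loss = loss_value
        self.npoints = npoints

    def loss(self):
        return self._loss


# The combined stopping criterion, exactly as written in the new docs line.
goal = lambda l: l.loss() < 0.03 or l.npoints > 1000

print(goal(FakeLearner(0.5, 10)))    # False: keep sampling
print(goal(FakeLearner(0.02, 10)))   # True: loss converged
print(goal(FakeLearner(0.5, 2000)))  # True: point budget exhausted
```

Either condition alone suffices to stop, which is why `or` (not `and`) is the right connective for a "converge or give up after N points" policy.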

0 commit comments
