
Commit 3fc2ed8

Use different YAML metadata format

1 parent 85aa532

16 files changed: +178, -258 lines
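The change is the same throughout: every MyST `{code-cell}` directive gains an explicit `ipython3` kernel argument, and the `---`-delimited YAML option block is replaced by MyST's short-hand `:key: value` option syntax. A minimal sketch of the two equivalent spellings (the cell body here is illustrative, taken from the diffs below):

Before, options in a delimited YAML block:

    ```{code-cell}
    ---
    tags: [hide-cell]
    ---
    import adaptive
    ```

After, short-hand options and an explicit kernel name:

    ```{code-cell} ipython3
    :tags: [hide-cell]
    import adaptive
    ```

Both forms are accepted by MyST; the short-hand form is more compact, and `ipython3` names the Jupyter kernel used to execute the cell.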

README.md

Lines changed: 1 addition & 2 deletions

@@ -91,9 +91,8 @@ runner.live_info()
 runner.live_plot()
 ```
 
-```{raw} html
 <img src="https://user-images.githubusercontent.com/6897215/38739170-6ac7c014-3f34-11e8-9e8f-93b3a3a3d61b.gif" width='20%'> </img> <img src="https://user-images.githubusercontent.com/6897215/35219611-ac8b2122-ff73-11e7-9332-adffab64a8ce.gif" width='40%'> </img> <img src="https://user-images.githubusercontent.com/6897215/47256441-d6d53700-d480-11e8-8224-d1cc49dbdcf5.gif" width='20%'> </img>
-```
+
 
 % not-in-documentation-end
 

docs/source/algorithms_and_examples.md

Lines changed: 10 additions & 20 deletions

@@ -42,10 +42,8 @@ In addition to the learners, `adaptive` also provides primitives for running the
 Here are some examples of how Adaptive samples vs. homogeneous sampling.
 Click on the *Play* {fa}`play` button or move the sliders.
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 import itertools
 import adaptive
@@ -83,10 +81,8 @@ runner.live_info() # shows a widget with status information
 runner.live_plot()
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 
 def f(x, offset=0.07357338543088588):
@@ -124,10 +120,8 @@ layout.opts(plot=dict(toolbar=None))
 
 ## {class}`adaptive.Learner2D`
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 
 def ring(xy):
@@ -159,10 +153,8 @@ hv.HoloMap(plots, kdims=["npoints"]).collate()
 
 ## {class}`adaptive.AverageLearner`
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 
 def g(n):
@@ -187,10 +179,8 @@ hv.HoloMap(plots, kdims=["npoints"])
 
 ## {class}`adaptive.LearnerND`
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 
 def sphere(xyz):

docs/source/conf.py

Lines changed: 1 addition & 0 deletions

@@ -74,6 +74,7 @@
 # myst-nb configuration
 nb_execution_mode = "cache"
 nb_execution_timeout = 180
+execution_fail_on_error = True
 
 
 def setup(app):
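For reference, the myst-nb execution block in `docs/source/conf.py` now reads as below. The comments are editorial glosses of what the options appear to do (the option semantics are inferred from their names and myst-nb's documented behavior, not stated in the commit):

    # myst-nb configuration
    nb_execution_mode = "cache"       # execute notebooks, reusing cached results until sources change
    nb_execution_timeout = 180        # execution timeout, in seconds
    execution_fail_on_error = True    # make the docs build fail if a cell raises, as the name suggests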

docs/source/logo.md

Lines changed: 1 addition & 1 deletion

@@ -10,7 +10,7 @@ jupytext:
     jupytext_version: 1.13.8
 ---
 
-```{code-cell}
+```{code-cell} ipython3
 ---
 tags: [remove-input]
 ---

docs/source/tutorial/tutorial.AverageLearner.md

Lines changed: 8 additions & 12 deletions

@@ -16,10 +16,8 @@ Because this documentation consists of static html, the `live_plot` and `live_info` widget is not live.
 Download the notebook in order to see the real behaviour. [^download]
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 import adaptive
 
@@ -31,7 +29,7 @@ The next type of learner averages a function until the uncertainty in the average meets some condition.
 This is useful for sampling a random variable.
 The function passed to the learner must formally take a single parameter, which should be used like a “seed” for the (pseudo-) random variable (although in the current implementation the seed parameter can be ignored by the function).
 
-```{code-cell}
+```{code-cell} ipython3
 def g(n):
     import random
     from time import sleep
@@ -45,25 +43,23 @@ def g(n):
     return val
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 learner = adaptive.AverageLearner(g, atol=None, rtol=0.01)
 # `loss < 1` means that we reached the `rtol` or `atol`
 runner = adaptive.Runner(learner, goal=lambda l: l.loss() < 1)
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 await runner.task  # This is not needed in a notebook environment!
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_info()
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_plot(update_interval=0.1)
 ```

docs/source/tutorial/tutorial.AverageLearner1D.md

Lines changed: 18 additions & 26 deletions

@@ -16,10 +16,8 @@ Because this documentation consists of static html, the `live_plot` and `live_info` widget is not live.
 Download the notebook in order to see the real behaviour. [^download]
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 import adaptive
 
@@ -34,7 +32,7 @@ from functools import partial
 
 First, we define the (noisy) function to be sampled. Note that the parameter `sigma` corresponds to the standard deviation of the Gaussian noise.
 
-```{code-cell}
+```{code-cell} ipython3
 def noisy_peak(seed_x, sigma=0, peak_width=0.05, offset=-0.5):
     seed, x = seed_x  # tuple with seed and `x` value
     y = x**3 - x + 3 * peak_width**2 / (peak_width**2 + (x - offset) ** 2)
@@ -45,15 +43,15 @@ def noisy_peak(seed_x, sigma=0, peak_width=0.05, offset=-0.5):
 
 This is how the function looks in the absence of noise:
 
-```{code-cell}
+```{code-cell} ipython3
 xs = np.linspace(-2, 2, 500)
 ys = [noisy_peak((seed, x), sigma=0) for seed, x in enumerate(xs)]
 hv.Path((xs, ys))
 ```
 
 And an example of a single realization of the noisy function:
 
-```{code-cell}
+```{code-cell} ipython3
 ys = [noisy_peak((seed, x), sigma=1) for seed, x in enumerate(xs)]
 hv.Path((xs, ys))
 ```
@@ -63,14 +61,14 @@ The learner will autonomously determine whether the next samples should be taken
 
 We start by initializing a 1D average learner:
 
-```{code-cell}
+```{code-cell} ipython3
 learner = adaptive.AverageLearner1D(partial(noisy_peak, sigma=1), bounds=(-2, 2))
 ```
 
 As with other types of learners, we need to initialize a runner with a certain goal to run our learner.
 In this case, we set 10000 samples as the goal (the second condition ensures that we have at least 20 samples at each point):
 
-```{code-cell}
+```{code-cell} ipython3
 def goal(nsamples):
     def _goal(learner):
         return learner.nsamples >= nsamples and learner.min_samples_per_point >= 20
@@ -81,15 +79,13 @@ def goal(nsamples):
 runner = adaptive.Runner(learner, goal=goal(10_000))
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 await runner.task  # This is not needed in a notebook environment!
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_info()
 runner.live_plot(update_interval=0.1)
 ```
@@ -110,43 +106,39 @@ The most relevant are:
 As an example, assume that we wanted to resample the points from the previous learner.
 We can decrease `delta` to 0.1 and set `min_error` to 0.05 if we do not require accuracy beyond this value:
 
-```{code-cell}
+```{code-cell} ipython3
 learner.delta = 0.1
 learner.min_error = 0.05
 runner = adaptive.Runner(learner, goal=goal(20_000))
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 await runner.task  # This is not needed in a notebook environment!
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_info()
 runner.live_plot(update_interval=0.1)
 ```
 
 On the contrary, if we want to push forward the "exploration", we can set a larger `delta` and limit the maximum number of samples taken at each point:
 
-```{code-cell}
+```{code-cell} ipython3
 learner.delta = 0.3
 learner.max_samples = 1000
 
 runner = adaptive.Runner(learner, goal=goal(25_000))
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 await runner.task  # This is not needed in a notebook environment!
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_info()
 runner.live_plot(update_interval=0.1)
 ```

docs/source/tutorial/tutorial.BalancingLearner.md

Lines changed: 8 additions & 12 deletions

@@ -16,10 +16,8 @@ Because this documentation consists of static html, the `live_plot` and `live_info` widget is not live.
 Download the notebook in order to see the real behaviour. [^download]
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 import adaptive
 
@@ -36,7 +34,7 @@ When you request a point from the balancing learner, it will query all of its
 
 The balancing learner can for example be used to implement a poor-man’s 2D learner by using the {class}`~adaptive.Learner1D`.
 
-```{code-cell}
+```{code-cell} ipython3
 def h(x, offset=0):
     a = 0.01
     return x + a**2 / (a**2 + (x - offset) ** 2)
@@ -51,19 +49,17 @@ bal_learner = adaptive.BalancingLearner(learners)
 runner = adaptive.Runner(bal_learner, goal=lambda l: l.loss() < 0.01)
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 await runner.task  # This is not needed in a notebook environment!
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_info()
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 plotter = lambda learner: hv.Overlay([L.plot() for L in learner.learners])
 runner.live_plot(plotter=plotter, update_interval=0.1)
 ```
@@ -72,7 +68,7 @@ Often one wants to create a set of `learner`s for a cartesian product of parameters.
 For that particular case we’ve added a `classmethod` called {class}`~adaptive.BalancingLearner.from_product`.
 See how it works below
 
-```{code-cell}
+```{code-cell} ipython3
 from scipy.special import eval_jacobi
 
 

docs/source/tutorial/tutorial.DataSaver.md

Lines changed: 9 additions & 13 deletions

@@ -16,10 +16,8 @@ Because this documentation consists of static html, the `live_plot` and `live_info` widget is not live.
 Download the notebook in order to see the real behaviour. [^download]
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 import adaptive
 
@@ -30,7 +28,7 @@ If the function that you want to learn returns a value along with some metadata,
 
 In the following example the function to be learned returns its result and the execution time in a dictionary:
 
-```{code-cell}
+```{code-cell} ipython3
 from operator import itemgetter
 
 
@@ -56,29 +54,27 @@ learner = adaptive.DataSaver(_learner, arg_picker=itemgetter("y"))
 
 `learner.learner` is the original learner, so `learner.learner.loss()` will call the correct loss method.
 
-```{code-cell}
+```{code-cell} ipython3
 runner = adaptive.Runner(learner, goal=lambda l: l.learner.loss() < 0.1)
 ```
 
-```{code-cell}
----
-tags: [hide-cell]
----
+```{code-cell} ipython3
+:tags: [hide-cell]
 
 await runner.task  # This is not needed in a notebook environment!
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_info()
 ```
 
-```{code-cell}
+```{code-cell} ipython3
 runner.live_plot(plotter=lambda l: l.learner.plot(), update_interval=0.1)
 ```
 
 Now the `DataSavingLearner` will have an dictionary attribute `extra_data` that has `x` as key and the data that was returned by `learner.function` as values.
 
-```{code-cell}
+```{code-cell} ipython3
 learner.extra_data
 ```
 
