Commit 2f86026: Update docs (1 parent: cdbc260)

1 file changed: +4 −8 lines

1 file changed

+4
-8
lines changed

doc/source/examples.rst

@@ -71,17 +71,13 @@ other implemented integrators.
     _ = sampler.run_integration(10)
 
     # Now we can use sampler to generate random numbers
-    rnds, _, px = sampler.generate_random_array(100)
+    rnds, px = sampler.generate_random_array(100)
 
 The first object returned by ``generate_random_array`` is the random points,
 in the case of the example an array of shape ``(100, 10)``, i.e., the first axis
 is the number of requested events and the second axis the number of dimensions.
 
-The second object, ignored in this example, is whatever information the algorithm
-need to train. Since we are just generating random numbers and not training anymore
-we can ignore that.
-
-Finally, ``generate_random_array`` returns also the probability distribution
+Then ``generate_random_array`` also returns the probability distribution
 of the random points (i.e., the weight they carry).
 
 For convenience we include sampler wrappers which directly return a trained
@@ -92,7 +88,7 @@ reference to the ``generate_random_array`` method:
     from vegasflow import vegas_sampler
 
     sampler = vegas_sampler(my_complicated_fun, n_dim, n_events)
-    rnds, _, px = sampler(100)
+    rnds, px = sampler(100)
 
 
 It is possible to change the number of training steps (default 5) or to retrieve
@@ -102,7 +98,7 @@ arguments.
 .. code-block:: python
 
     sampler_class = vegas_sampler(my_complicated_fun, n_dim, n_events, training_steps=1, return_class=True)
-    rnds, _, px = sampler_class.generate_random_array(100)
+    rnds, px = sampler_class.generate_random_array(100)
 
 Integrating a numpy function
 ============================