The snippets below develop the polynomial regression example from our paper's *Overview* section.
### Vectorizing Generative Functions with `vmap`
We begin by expressing the polynomial regression model as a composition of generative functions (`@gen`-decorated Python functions).
Each random choice (invocation of a generative function) is tagged with a string address (`"a"`, `"b"`, `"c"`, `"obs"`), which is used to construct a structured representation of the model’s random variables, called a _trace_.
In GenJAX, packaging the coefficients inside a callable `Lambda` Pytree is a convenient way to allow downstream computations to call the curve directly, while the trace retains access to its parameters.
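As a rough sketch (assuming GenJAX's `@gen` decorator, the `@ "addr"` addressing syntax, and a `Pytree.dataclass` helper; exact spellings vary across GenJAX versions), the model might look like:

```python
# Sketch of the model, not the repository's exact code. Assumes GenJAX's
# `@gen` decorator, the `@ "addr"` addressing syntax, and a
# `Pytree.dataclass` registration helper.
import jax.numpy as jnp
from genjax import Pytree, gen, normal


@Pytree.dataclass
class Lambda(Pytree):
    # Callable Pytree: packages the coefficients so downstream code can
    # call the curve while the trace retains access to its parameters.
    a: jnp.ndarray
    b: jnp.ndarray
    c: jnp.ndarray

    def __call__(self, x):
        return self.a + self.b * x + self.c * jnp.square(x)


@gen
def polynomial():
    # Each random choice is tagged with a string address ("a", "b", "c").
    a = normal(0.0, 1.0) @ "a"
    b = normal(0.0, 1.0) @ "b"
    c = normal(0.0, 1.0) @ "c"
    return Lambda(a, b, c)


@gen
def point(x, curve):
    # Noisy observation of the curve's value at x, at address "obs".
    y = normal(curve(x), 0.1) @ "obs"
    return y
```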
Vectorizing the `point` generative function with `vmap` mirrors the Overview section's Figure 3: the resulting trace preserves the hierarchical structure of the polynomial's coefficients while lifting the observation random choice into a vectorized, array-valued version. This structure-preserving vectorization is what later enables us to reason about datasets consisting of many points (and other inference logic) in a vectorized fashion.
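Continuing the sketch above, the vectorization step might look like the following (the `.vmap(in_axes=...)` combinator spelling and the `simulate(key, args)` signature are assumptions that may differ by GenJAX version):

```python
# Sketch of structure-preserving vectorization. The `.vmap(in_axes=...)`
# combinator and `simulate(key, args)` signature are assumptions; check
# the GenJAX version in use.
import jax


@gen
def model(xs):
    curve = polynomial() @ "curve"
    # Vectorize `point` over the data while sharing the same curve:
    # the coefficients stay hierarchical, "obs" becomes array-valued.
    ys = point.vmap(in_axes=(0, None))(xs, curve) @ "ys"
    return ys


xs = jnp.linspace(-1.0, 1.0, 20)
key = jax.random.PRNGKey(0)
trace = model.simulate(key, (xs,))
```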
### Vectorized Programmable Inference
The generative function interface supplies a small set of methods - `simulate`, `generate`, `assess`, `update` - that we can compose into inference algorithms.
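For example (a hedged sketch; method signatures and return types vary across GenJAX versions), `simulate` runs the model forward and records a trace, while `assess` scores a fixed set of choices:

```python
# Hedged sketch of two interface methods; exact signatures and return
# types depend on the GenJAX version.
key = jax.random.PRNGKey(1)

# simulate: run the model forward, recording a trace.
trace = model.simulate(key, (xs,))
choices = trace.get_choices()  # structured record of the random choices
score = trace.get_score()      # log density of this trace

# assess: score a fixed set of choices without sampling
# (this signature is an assumption).
density, retval = model.assess(choices, (xs,))
```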
Here, we implement likelihood weighting (importance sampling): a single-particle routine constrains the observation site to a fixed value via the `generate` interface, while a vectorized wrapper scales up the number of particles. The logic of guessing (sampling) and checking (computing an importance weight), implemented internally in `generate`, remains the same across particles; only the array dimensions vary with the particle count.
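A sketch of this routine, assuming a choice-map builder (`ChoiceMapBuilder`) for the constraint and a `generate(key, constraints, args)` method returning a trace and a log importance weight; all spellings here are assumptions:

```python
# Hedged sketch of likelihood weighting. Assumes `ChoiceMapBuilder` (as C)
# for constraints and a `generate(key, constraints, args)` method returning
# (trace, log_weight); exact names vary across GenJAX versions.
from genjax import ChoiceMapBuilder as C


def importance_single(key, xs, ys):
    # Guess-and-check in one call: `generate` samples the unconstrained
    # choices and returns the log importance weight for the constrained ones.
    constraints = C["ys", "obs"].set(ys)
    trace, log_weight = model.generate(key, constraints, (xs,))
    return trace, log_weight


def likelihood_weighting(key, xs, ys, n_particles):
    keys = jax.random.split(key, n_particles)
    # Same single-particle logic across particles; only the array
    # dimensions change with the particle count.
    return jax.vmap(importance_single, in_axes=(0, None, None))(keys, xs, ys)


# Synthetic data, purely for illustration.
ys_obs = 1.0 + 2.0 * xs - 0.5 * jnp.square(xs)
traces, log_weights = likelihood_weighting(jax.random.PRNGKey(2), xs, ys_obs, 1000)
```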