Commit bab91b3

pitmonticone authored: Clean docs and docstrings (#1847)

* Update variational_inference.md
* Update docs/src/for-developers/variational_inference.md
* Clean docs and docstrings

Fixed a few typos.

Co-authored-by: Hong Ge <[email protected]>
Co-authored-by: David Widmann <[email protected]>

1 parent 9951638 commit bab91b3

File tree

11 files changed: +13 −13 lines changed

docs/README.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -55,7 +55,7 @@ directory. Edits must be made to the `docs/src/` versions.
 
 ## MacOS Notes
 Under MacOS one might need to install the following additional gems's
-to have jekyll running as descibed above.
+to have jekyll running as described above.
 
 ```
 gem install jekyll-paginate
````

docs/src/for-developers/compiler.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -74,7 +74,7 @@ end
 
 This would allow us to generate a model by calling `gauss(; x = rand(3))`.
 
-If an argument has a default value `missing`, it is treated as a random variable. For variables which require an intialization because we
+If an argument has a default value `missing`, it is treated as a random variable. For variables which require an initialization because we
 need to loop or broadcast over its elements, such as `x` above, the following needs to be done:
 ```julia
 if x === missing
````
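The `missing`-default pattern described in that hunk can be sketched as a complete model. This is a hypothetical illustration following the docs' `gauss(; x = rand(3))` example, not code from this commit; the dimension `3` and the `Normal` priors are assumptions for the sketch:

```julia
using Turing

@model function gauss(; x = missing)
    # If `x` is passed as `missing`, it is treated as a random variable.
    # Because we broadcast over its elements below, it must be initialized first:
    if x === missing
        x = Vector{Float64}(undef, 3)
    end
    m ~ Normal(0, 1)
    for i in eachindex(x)
        x[i] ~ Normal(m, 1)
    end
end

# Conditioning on observed data instead of sampling `x`:
model = gauss(; x = rand(3))
```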

docs/src/for-developers/how_turing_implements_abstractmcmc.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -258,7 +258,7 @@ It simply returns the density (in the discrete case, the probability) of the obs
 
 ## 4. Summary: Importance Sampling step by step
 
-We focus on the AbstractMCMC functions that are overriden in `is.jl` and executed inside `mcmcsample`: `step!`, which is called `n_samples` times, and `sample_end!`, which is executed once after those `n_samples` iterations.
+We focus on the AbstractMCMC functions that are overridden in `is.jl` and executed inside `mcmcsample`: `step!`, which is called `n_samples` times, and `sample_end!`, which is executed once after those `n_samples` iterations.
 
 * During the $i$-th iteration, `step!` does 3 things:
     * `empty!!(spl.state.vi)`: remove information about the previous sample from the sampler's `VarInfo`
````

docs/src/for-developers/interface.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -192,7 +192,7 @@ end
 ```
 
 The first `step!` function just packages up the initial parameterization inside the
-sampler, and returns it. We implicity accept the very first parameterization.
+sampler, and returns it. We implicitly accept the very first parameterization.
 
 The other `step!` function performs the usual steps from Metropolis-Hastings. Included are
 several helper functions, `proposal` and `q`, which are designed to replicate the functions
````

docs/src/for-developers/variational_inference.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -305,7 +305,7 @@ $$
 \end{align*}.
 $$
 
-And maximizing this wrt. $\mu$ and $\Sigma$ is what's referred to as **Automatic Differentation Variational Inference (ADVI)**!
+And maximizing this wrt. $\mu$ and $\Sigma$ is what's referred to as **Automatic Differentiation Variational Inference (ADVI)**!
 
 Now if you want to try it out, [check out the tutorial on how to use ADVI in Turing.jl](../../tutorials/09-variational-inference)!
````

docs/src/using-turing/autodiff.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -42,7 +42,7 @@ c = sample(
 ```
 
 
-Generally, `TrackerAD` is faster when sampling from variables of high dimensionality (greater than 20) and `ForwardDiffAD` is more efficient for lower-dimension variables. This functionality allows those who are performance sensistive to fine tune their automatic differentiation for their specific models.
+Generally, `TrackerAD` is faster when sampling from variables of high dimensionality (greater than 20) and `ForwardDiffAD` is more efficient for lower-dimension variables. This functionality allows those who are performance sensitive to fine tune their automatic differentiation for their specific models.
 
 
-If the differentation method is not specified in this way, Turing will default to using whatever the global AD backend is. Currently, this defaults to `ForwardDiff`.
+If the differentiation method is not specified in this way, Turing will default to using whatever the global AD backend is. Currently, this defaults to `ForwardDiff`.
````
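The global-backend fallback mentioned in that hunk can be sketched as follows. This is an illustrative assumption based on the `setadbackend` API from the era of this commit (newer Turing releases select the backend per sampler via an `adtype` keyword instead); the model and sampler settings are made up for the sketch:

```julia
using Turing

# Switch the global AD backend; samplers that do not specify a
# backend themselves fall back to this (default: :forwarddiff).
Turing.setadbackend(:forwarddiff)

@model function demo(x)
    m ~ Normal(0, 1)
    x ~ Normal(m, 1)
end

# HMC here uses whatever the global backend currently is.
chain = sample(demo(1.5), HMC(0.1, 5), 100)
```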

docs/src/using-turing/guide.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -540,7 +540,7 @@ chn = sample(simple_choice_f, Gibbs(HMC(0.2, 3, :p), PG(20, :z)), 1000)
 ```
 
 
-The `Gibbs` sampler can be used to specify unique automatic differentation backends for different variable spaces. Please see the [Automatic Differentiation]({{site.baseurl}}/docs/using-turing/autodiff) article for more.
+The `Gibbs` sampler can be used to specify unique automatic differentiation backends for different variable spaces. Please see the [Automatic Differentiation]({{site.baseurl}}/docs/using-turing/autodiff) article for more.
 
 
 For more details of compositional sampling in Turing.jl, please check the corresponding [paper](http://proceedings.mlr.press/v84/ge18b.html).
````

docs/src/using-turing/performancetips.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -44,7 +44,7 @@ Generally, try to use `:forwarddiff` for models with few parameters and `:revers
 In case of `:tracker` and `:zygote`, it is necessary to avoid loops for now.
 This is mainly due to the reverse-mode AD backends `Tracker` and `Zygote` which are inefficient for such cases. `ReverseDiff` does better but vectorized operations will still perform better.
 
-Avoiding loops can be done using `filldist(dist, N)` and `arraydist(dists)`. `filldist(dist, N)` creates a multivariate distribution that is composed of `N` identical and independent copies of the univariate distribution `dist` if `dist` is univariate, or it creates a matrix-variate distribution composed of `N` identical and idependent copies of the multivariate distribution `dist` if `dist` is multivariate. `filldist(dist, N, M)` can also be used to create a matrix-variate distribution from a univariate distribution `dist`. `arraydist(dists)` is similar to `filldist` but it takes an array of distributions `dists` as input. Writing a [custom distribution](advanced) with a custom adjoint is another option to avoid loops.
+Avoiding loops can be done using `filldist(dist, N)` and `arraydist(dists)`. `filldist(dist, N)` creates a multivariate distribution that is composed of `N` identical and independent copies of the univariate distribution `dist` if `dist` is univariate, or it creates a matrix-variate distribution composed of `N` identical and independent copies of the multivariate distribution `dist` if `dist` is multivariate. `filldist(dist, N, M)` can also be used to create a matrix-variate distribution from a univariate distribution `dist`. `arraydist(dists)` is similar to `filldist` but it takes an array of distributions `dists` as input. Writing a [custom distribution](advanced) with a custom adjoint is another option to avoid loops.
 
 
 ## Ensure that types in your model can be inferred
````
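The `filldist`/`arraydist` replacement for observation loops described in that hunk can be sketched in a small model. This is a hypothetical example (the model and priors are invented for illustration); `filldist` and `arraydist` are provided to Turing models via DistributionsAD:

```julia
using Turing

@model function demo(x)
    m ~ Normal(0, 1)
    # Instead of looping:
    #   for i in eachindex(x); x[i] ~ Normal(m, 1); end
    # use one vectorized statement, which reverse-mode AD handles efficiently:
    x ~ filldist(Normal(m, 1), length(x))
end

# `arraydist` is the analogue for an array of *different* distributions, e.g.:
#   x ~ arraydist([Normal(m, i) for i in 1:length(x)])
chain = sample(demo(randn(10)), NUTS(), 100)
```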

src/essential/container.jl

Lines changed: 1 addition & 1 deletion
````diff
@@ -16,7 +16,7 @@ function TracedModel(
     return TracedModel{AbstractSampler,AbstractVarInfo,Model,Tuple}(model, sampler, varinfo, evaluator)
 end
 
-# Smiliar to `evaluate!!` except that we return the evaluator signature without excutation.
+# Smiliar to `evaluate!!` except that we return the evaluator signature without execution.
 # TODO: maybe move to DynamicPPL
 @generated function _get_evaluator(
     model::Model{_F,argnames}, varinfo, context
````

src/inference/AdvancedSMC.jl

Lines changed: 1 addition & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -349,7 +349,7 @@ function DynamicPPL.assume(
349349
DynamicPPL.updategid!(vi, vn, spl)
350350
r = vi[vn]
351351
end
352-
else # vn belongs to other sampler <=> conditionning on vn
352+
else # vn belongs to other sampler <=> conditioning on vn
353353
if haskey(vi, vn)
354354
r = vi[vn]
355355
else

0 commit comments