
Commit 2cf8bb8

Author: Shravan Goswami
Parent: 110a8b3

fixed Pkg command in all docs

29 files changed: +29 −29 lines

tutorials/00-introduction/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 ### Introduction
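
The same one-line fix repeats in every file below. For context: in Julia's Pkg API, `Pkg.instantiate` takes no positional path argument, so `Pkg.instantiate(".")` raises a `MethodError`; `instantiate` always operates on the currently active project environment. A minimal sketch of the corrected preamble as it would run standalone (the explicit `Pkg.activate(".")` call is an assumption about selecting the tutorial's environment and is not part of this commit):

```julia
using Pkg

# Assumption (not in this commit): activate the tutorial's own project
# environment first; Pkg.instantiate() acts on whichever environment is active.
Pkg.activate(".")

# Install and precompile all dependencies recorded in the active
# Project.toml/Manifest.toml. Note: instantiate takes no path argument.
Pkg.instantiate()
```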

tutorials/01-gaussian-mixture-model/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 The following tutorial illustrates the use of Turing for clustering data using a Bayesian mixture model.

tutorials/02-logistic-regression/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 [Bayesian logistic regression](https://en.wikipedia.org/wiki/Logistic_regression#Bayesian) is the Bayesian counterpart to a common tool in machine learning, logistic regression.

tutorials/03-bayesian-neural-network/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 In this tutorial, we demonstrate how one can implement a Bayesian Neural Network using a combination of Turing and [Flux](https://github.com/FluxML/Flux.jl), a suite of machine learning tools. We will use Flux to specify the neural network's layers and Turing to implement the probabilistic inference, with the goal of implementing a classification algorithm.

tutorials/04-hidden-markov-model/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 This tutorial illustrates training Bayesian [Hidden Markov Models](https://en.wikipedia.org/wiki/Hidden_Markov_model) (HMM) using Turing. The main goals are learning the transition matrix, emission parameter, and hidden states. For a more rigorous academic overview on Hidden Markov Models, see [An introduction to Hidden Markov Models and Bayesian Networks](http://mlg.eng.cam.ac.uk/zoubin/papers/ijprai.pdf) (Ghahramani, 2001).

tutorials/05-linear-regression/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 Turing is powerful when applied to complex hierarchical models, but it can also be put to task at common statistical procedures, like [linear regression](https://en.wikipedia.org/wiki/Linear_regression).

tutorials/06-infinite-mixture-model/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 In many applications it is desirable to allow the model to adjust its complexity to the amount of data. Consider for example the task of assigning objects into clusters or groups. This task often involves the specification of the number of groups. However, often times it is not known beforehand how many groups exist. Moreover, in some applictions, e.g. modelling topics in text documents or grouping species, the number of examples per group is heavy tailed. This makes it impossible to predefine the number of groups and requiring the model to form new groups when data points from previously unseen groups are observed.

tutorials/07-poisson-regression/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 This notebook is ported from the [example notebook](https://www.pymc.io/projects/examples/en/latest/generalized_linear_models/GLM-poisson-regression.html) of PyMC3 on Poisson Regression.

tutorials/08-multinomial-logistic-regression/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 [Multinomial logistic regression](https://en.wikipedia.org/wiki/Multinomial_logistic_regression) is an extension of logistic regression. Logistic regression is used to model problems in which there are exactly two possible discrete outcomes. Multinomial logistic regression is used to model problems in which there are two or more possible discrete outcomes.

tutorials/09-variational-inference/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ engine: julia
 #| echo: false
 #| output: false
 using Pkg;
-Pkg.instantiate(".");
+Pkg.instantiate();
 ```
 
 In this post we'll have a look at what's know as **variational inference (VI)**, a family of _approximate_ Bayesian inference methods, and how to use it in Turing.jl as an alternative to other approaches such as MCMC. In particular, we will focus on one of the more standard VI methods called **Automatic Differentation Variational Inference (ADVI)**.
