Commit 4950509

Link to default dev branch
1 parent d560ba5 commit 4950509

11 files changed: 31 additions, 31 deletions
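
The change itself is mechanical: every nbviewer, Colab and GitHub link that pointed at the master branch now points at the default dev branch. A bulk rewrite like this is usually scripted; the sketch below shows one way it could be done in Python (an assumed workflow, not part of the commit — the `relink` helper, the file-type filter and the `OLD`/`NEW` substrings are illustrative only).

```python
from pathlib import Path

# Assumed helper for a bulk branch-reference update: rewrite every link that
# points at .../bayesian-machine-learning/blob/master/... so it points at the
# dev branch instead. Only Markdown files and notebooks are touched.
OLD = "bayesian-machine-learning/blob/master/"
NEW = "bayesian-machine-learning/blob/dev/"

def relink(root: str = ".") -> None:
    for path in Path(root).rglob("*"):
        if path.suffix not in {".md", ".ipynb"}:
            continue
        text = path.read_text(encoding="utf-8")
        updated = text.replace(OLD, NEW)
        if updated != text:
            path.write_text(updated, encoding="utf-8")
            print(f"updated {path}")

if __name__ == "__main__":
    relink()
```

Anchoring the replaced substring on the repository path keeps the rewrite from touching anything other than these links.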

README.md

Lines changed: 15 additions & 15 deletions
@@ -5,42 +5,42 @@
This repository is a collection of notebooks related to *Bayesian Machine Learning*. The following links display
the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a proper rendering of formulas.

-- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-neural-networks/bayesian_neural_networks.ipynb).
+- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks.ipynb).
Demonstrates how to implement a Bayesian neural network and variational inference of network parameters. Example implementation
with Keras (see also
-[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb)).
+[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb)).

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_2.ipynb)
-[Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_2.ipynb).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
+[Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
Introduction to stochastic variational inference with variational autoencoder as application example. Implementation
with Tensorflow 2.x.

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1.ipynb)
-[Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1.ipynb).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
+[Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
implementation with plain NumPy/SciPy and scikit-learn for comparison (see also
-[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb)).
+[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb)).

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/bayesian-optimization/bayesian_optimization.ipynb)
-[Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-optimization/bayesian_optimization.ipynb).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
+[Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with libraries
scikit-optimize and GPyOpt. Hyper-parameter tuning as application example.

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/gaussian-processes/gaussian_processes.ipynb)
-[Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/gaussian-processes/gaussian_processes.ipynb).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
+[Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb).
Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with libraries
scikit-learn and GPy.

-- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression.ipynb).
+- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn
for comparison (see also
[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
-[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb)).
+[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb)).

-- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_dfc.ipynb).
+- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb).
Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example
implementation with Keras.

-- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt.ipynb).
+- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt.ipynb).
Describes an approach for conditionally generating outputs with desired properties by doing Bayesian optimization in
latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.

autoencoder-applications/variational_autoencoder.ipynb

Lines changed: 3 additions & 3 deletions
@@ -8,10 +8,10 @@
"\n",
"**Update, Dec. 17<sup>th</sup> 2019**: This notebook is superseded by the following two notebooks:\n",
"\n",
-"- [Latent variable models - part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1.ipynb)\n",
-"- [Latent variable models - part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_2.ipynb).\n",
+"- [Latent variable models - part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)\n",
+"- [Latent variable models - part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).\n",
"\n",
-"The following old variational autoencoder code<sup>[1]</sup> is still used in [other](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt.ipynb) [notebooks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_dfc.ipynb) and kept here for further reference."
+"The following old variational autoencoder code<sup>[1]</sup> is still used in [other](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt.ipynb) [notebooks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb) and kept here for further reference."
]
},
{

autoencoder-applications/variational_autoencoder_dfc.ipynb

Lines changed: 2 additions & 2 deletions
@@ -16,7 +16,7 @@
"\n",
"### Plain VAE\n",
"\n",
-"In a [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master//autoencoder-applications/variational_autoencoder.ipynb) I introduced the variational autoencoder (VAE) and how it can be trained with a variational lower bound $\\mathcal{L}$ as optimization objective using stochastic gradient ascent methods. In context of stochastic gradient descent its negative value is used as loss function $L_{vae}$ which is a sum of a reconstruction loss $L_{rec}$ and a regularization term $L_{kl}$:\n",
+"In a [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev//autoencoder-applications/variational_autoencoder.ipynb) I introduced the variational autoencoder (VAE) and how it can be trained with a variational lower bound $\\mathcal{L}$ as optimization objective using stochastic gradient ascent methods. In context of stochastic gradient descent its negative value is used as loss function $L_{vae}$ which is a sum of a reconstruction loss $L_{rec}$ and a regularization term $L_{kl}$:\n",
"\n",
"$$\n",
"\\begin{align*}\n",
@@ -87,7 +87,7 @@
"source": [
"## Training\n",
"\n",
-"In contrast to the original paper we will use the MNIST handwritten digits dataset for training and for demonstrating how a perceptual loss improves over a pixel-by-pixel reconstruction loss. We can therefore reuse the VAE [encoder](https://github.com/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt_util.py#L15-L36) and [decoder](https://github.com/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt_util.py#L38-L53) architectures from the already mentioned [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder.ipynb). The perceptual model is a [small CNN](https://github.com/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt_util.py#L91-L105) (Fig. 3) that has already been trained in [another context](http://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt.ipynb#Optimization-objectives) to classify MNIST images."
+"In contrast to the original paper we will use the MNIST handwritten digits dataset for training and for demonstrating how a perceptual loss improves over a pixel-by-pixel reconstruction loss. We can therefore reuse the VAE [encoder](https://github.com/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt_util.py#L15-L36) and [decoder](https://github.com/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt_util.py#L38-L53) architectures from the already mentioned [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder.ipynb). The perceptual model is a [small CNN](https://github.com/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt_util.py#L91-L105) (Fig. 3) that has already been trained in [another context](http://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt.ipynb#Optimization-objectives) to classify MNIST images."
]
},
{

autoencoder-applications/variational_autoencoder_opt.ipynb

Lines changed: 4 additions & 4 deletions
@@ -29,7 +29,7 @@
"- How can application of Bayesian optimization methods be justified?\n",
"- What are possible alternatives to this approach? \n",
"\n",
-"I'll leave experiments with the chemical compounds dataset and the public chemical VAE for another article. The following assumes some basic familiarity with [variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder.ipynb), [Bayesian optimization](http://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-optimization/bayesian_optimization.ipynb) and [Gaussian processes](http://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/gaussian-processes/gaussian_processes.ipynb). For more information on these topics you may want to read the linked articles."
+"I'll leave experiments with the chemical compounds dataset and the public chemical VAE for another article. The following assumes some basic familiarity with [variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder.ipynb), [Bayesian optimization](http://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb) and [Gaussian processes](http://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb). For more information on these topics you may want to read the linked articles."
]
},
{
@@ -44,13 +44,13 @@
"\n",
"### Encoder\n",
"\n",
-"The encoder is a CNN, identical to the one presented in the [variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder.ipynb) notebook.\n",
+"The encoder is a CNN, identical to the one presented in the [variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder.ipynb) notebook.\n",
"\n",
"![encoder](images/vae-opt/encoder.png) \n",
"\n",
"### Decoder\n",
"\n",
-"The decoder is a CNN, identical to the one presented in the [variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder.ipynb) notebook.\n",
+"The decoder is a CNN, identical to the one presented in the [variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder.ipynb) notebook.\n",
"\n",
"![decoder](images/vae-opt/decoder.png)\n",
"\n",
@@ -90,7 +90,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"Code for the encoder and decoder has already been presented [elsewhere](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder.ipynb), so only code for the predictor is shown here (see [variational_autoencoder_opt_util.py](variational_autoencoder_opt_util.py) for other function definitions):"
+"Code for the encoder and decoder has already been presented [elsewhere](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder.ipynb), so only code for the predictor is shown here (see [variational_autoencoder_opt_util.py](variational_autoencoder_opt_util.py) for other function definitions):"
]
},
{

bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"This is a [PyMC3](https://docs.pymc.io/) implementation of the examples in [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression.ipynb). To recap, a linear regression model is a linear function of the parameters but not necessarily of the input. Input $x$ can be expanded with a set of non-linear basis functions $\\phi_j(x)$, where $(\\phi_1(x), \\dots, \\phi_M(x))^T = \\boldsymbol\\phi(x)$, for modeling a non-linear relationship between input $x$ and a function value $y$.\n",
+"This is a [PyMC3](https://docs.pymc.io/) implementation of the examples in [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb). To recap, a linear regression model is a linear function of the parameters but not necessarily of the input. Input $x$ can be expanded with a set of non-linear basis functions $\\phi_j(x)$, where $(\\phi_1(x), \\dots, \\phi_M(x))^T = \\boldsymbol\\phi(x)$, for modeling a non-linear relationship between input $x$ and a function value $y$.\n",
"\n",
"$$\n",
"y(x, \\mathbf{w}) = w_0 + \\sum_{j=1}^{M}{w_j \\phi_j(x)} = w_0 + \\mathbf{w}_{1:}^T \\boldsymbol\\phi(x) \\tag{1}\n",

bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@
"source": [
"## Linear basis function models\n",
"\n",
-"The following is a PyMC4 implementation of [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression.ipynb). To recap, a linear regression model is a linear function of the parameters but not necessarily of the input. Input $x$ can be expanded with a set of non-linear basis functions $\\phi_j(x)$, where $(\\phi_1(x), \\dots, \\phi_M(x))^T = \\boldsymbol\\phi(x)$, for modeling a non-linear relationship between input $x$ and a function value $y$.\n",
+"The following is a PyMC4 implementation of [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb). To recap, a linear regression model is a linear function of the parameters but not necessarily of the input. Input $x$ can be expanded with a set of non-linear basis functions $\\phi_j(x)$, where $(\\phi_1(x), \\dots, \\phi_M(x))^T = \\boldsymbol\\phi(x)$, for modeling a non-linear relationship between input $x$ and a function value $y$.\n",
"\n",
"$$\n",
"y(x, \\mathbf{w}) = w_0 + \\sum_{j=1}^{M}{w_j \\phi_j(x)} = w_0 + \\mathbf{w}_{1:}^T \\boldsymbol\\phi(x) \\tag{1}\n",
