This repository is a collection of notebooks related to *Bayesian Machine Learning*. The following links display
the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure proper rendering of formulas.

- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks.ipynb).
  Demonstrates how to implement a Bayesian neural network and variational inference of network parameters. Example implementation
  with Keras (see also
  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb)).

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
  [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
  Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation
  with TensorFlow 2.x.

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
  [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
  Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
  implementation with plain NumPy/SciPy and scikit-learn for comparison (see also
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb)
  and the sketch below).

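  A minimal, illustrative sketch of the scikit-learn side, fitting a two-component
  mixture with EM (the synthetic data and all parameter values are assumptions
  for illustration, not taken from the notebook):

  ```python
  import numpy as np
  from sklearn.mixture import GaussianMixture

  # Synthetic 1-D data drawn from two Gaussians (illustrative only).
  rng = np.random.default_rng(0)
  X = np.concatenate([rng.normal(-2.0, 0.5, 300),
                      rng.normal(3.0, 1.0, 700)]).reshape(-1, 1)

  # GaussianMixture estimates weights, means and covariances via EM.
  gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
  print(gmm.weights_, gmm.means_.ravel())

  # Posterior responsibilities p(z = k | x), the E-step quantities.
  print(gmm.predict_proba(X[:5]))
  ```
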
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
  [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
  Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with the libraries
  scikit-optimize and GPyOpt (see also the sketch below). Hyper-parameter tuning as application example.

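  For orientation, a minimal scikit-optimize sketch; the toy objective, search
  space and call budget below are illustrative assumptions, not the notebook's
  setup:

  ```python
  from skopt import gp_minimize

  # Toy black-box objective to minimize (illustrative assumption).
  def objective(params):
      x, = params
      return (x - 2.0) ** 2

  # gp_minimize fits a Gaussian process surrogate to past evaluations
  # and chooses the next point by maximizing an acquisition function.
  result = gp_minimize(objective,
                       dimensions=[(-5.0, 5.0)],  # search interval for x
                       n_calls=20,
                       random_state=0)

  print(result.x, result.fun)  # best input found and its objective value
  ```
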
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
  [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb).
  Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with the libraries
  scikit-learn and GPy (see also the sketch below).

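  A minimal scikit-learn sketch of GP regression; the toy sine data, kernel
  choice and noise level are illustrative assumptions:

  ```python
  import numpy as np
  from sklearn.gaussian_process import GaussianProcessRegressor
  from sklearn.gaussian_process.kernels import RBF

  # Noisy samples of a sine function (illustrative toy data).
  rng = np.random.default_rng(0)
  X_train = rng.uniform(-4.0, 4.0, size=(20, 1))
  y_train = np.sin(X_train).ravel() + 0.1 * rng.standard_normal(20)

  # RBF-kernel GP; alpha accounts for observation noise variance.
  gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.01)
  gpr.fit(X_train, y_train)

  # Posterior predictive mean and standard deviation at new inputs.
  X_test = np.linspace(-5.0, 5.0, 100).reshape(-1, 1)
  mean, std = gpr.predict(X_test, return_std=True)
  ```
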
- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
  Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn
  for comparison (see also
  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb)).
  A minimal scikit-learn sketch follows below.

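  A minimal sketch with scikit-learn's `BayesianRidge` on polynomial basis
  functions; the toy cubic data and the degree-3 basis are illustrative
  assumptions:

  ```python
  import numpy as np
  from sklearn.linear_model import BayesianRidge
  from sklearn.preprocessing import PolynomialFeatures

  # Toy data: a noisy cubic (illustrative only).
  rng = np.random.default_rng(0)
  x = rng.uniform(-1.0, 1.0, 30).reshape(-1, 1)
  y = (0.5 * x ** 3 - x).ravel() + 0.05 * rng.standard_normal(30)

  # Polynomial basis functions make the model linear in its parameters.
  poly = PolynomialFeatures(degree=3)
  Phi = poly.fit_transform(x)
  reg = BayesianRidge().fit(Phi, y)

  # Predictive mean and standard deviation on a test grid.
  Phi_test = poly.transform(np.linspace(-1.0, 1.0, 50).reshape(-1, 1))
  mean, std = reg.predict(Phi_test, return_std=True)
  ```
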
- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb).
  Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example
  implementation with Keras.

- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_opt.ipynb).
  Describes an approach for conditionally generating outputs with desired properties by performing Bayesian optimization in
  the latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.