|
2 | 2 |
|
3 | 3 | [![DOI](https://zenodo.org/badge/125869131.svg)](https://zenodo.org/badge/latestdoi/125869131)
4 | 4 |
|
5 | | -This repository is a collection of notebooks about *Bayesian Machine Learning*. The following links display |
6 | | -the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a proper rendering of formulas. **Update:** |
7 | | -PyMC3 and PyMC4 implementations are now available for some notebooks (more planned). |
| 5 | +This repository is a collection of notebooks related to *Bayesian Machine Learning*. The following links display |
| 6 | +the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a proper rendering of formulas. |
8 | 7 |
|
9 | | -- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1.ipynb)
10 | | - [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1.ipynb). |
11 | | - Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example |
12 | | - implementation with plain NumPy/SciPy and scikit-learn for comparison (see also |
13 | | - [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1_pymc3.ipynb)). |
| 8 | +- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-neural-networks/bayesian_neural_networks.ipynb). |
| 9 | + Demonstrates how to implement a Bayesian neural network and infer its parameters with variational
| 10 | + inference. Example implementation with Keras (see also
| 11 | + [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb)). |
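As a rough illustration of what the notebook covers, here is a minimal variational dense layer in Keras. It assumes a factorized Gaussian posterior over the kernel and a standard normal prior; names and initializer choices are illustrative, not taken from the notebook:

```python
import tensorflow as tf

class DenseVariational(tf.keras.layers.Layer):
    """Dense layer with a factorized Gaussian posterior over its kernel.

    Assumes a standard normal prior; the KL term is registered via
    add_loss so it is minimized together with the data-fit loss.
    """
    def __init__(self, units, kl_weight=1.0, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.kl_weight = kl_weight

    def build(self, input_shape):
        in_dim = int(input_shape[-1])
        self.w_mu = self.add_weight(name="w_mu", shape=(in_dim, self.units),
                                    initializer="glorot_normal")
        self.w_rho = self.add_weight(name="w_rho", shape=(in_dim, self.units),
                                     initializer=tf.constant_initializer(-3.0))
        self.b = self.add_weight(name="b", shape=(self.units,),
                                 initializer="zeros")

    def call(self, x):
        sigma = tf.nn.softplus(self.w_rho)
        eps = tf.random.normal(tf.shape(self.w_mu))
        w = self.w_mu + sigma * eps  # reparameterization trick
        # KL(q(w) || p(w)) in closed form for diagonal Gaussians, N(0, 1) prior
        kl = 0.5 * tf.reduce_sum(tf.square(sigma) + tf.square(self.w_mu)
                                 - 1.0 - 2.0 * tf.math.log(sigma))
        self.add_loss(self.kl_weight * kl)
        return tf.matmul(x, w) + self.b
```

Stacking such layers and training against an ordinary likelihood-based loss then minimizes a negative ELBO, because each layer contributes its KL penalty through `add_loss`.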
14 | 12 |
|
15 | | -- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_2.ipynb)
16 | | - [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_2.ipynb). |
| 13 | +- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_2.ipynb)
| 14 | + [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_2.ipynb). |
17 | 15 | Introduction to stochastic variational inference with a variational autoencoder as an application example. Implementation
18 | 16 | with TensorFlow 2.x.
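The two building blocks the notebook develops, the reparameterization trick and the negative ELBO, can be sketched in a few lines of TensorFlow 2.x (a Bernoulli likelihood and diagonal Gaussian posterior are common choices assumed here, not necessarily the notebook's exact setup):

```python
import tensorflow as tf

def sample_z(mu, log_var):
    # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I),
    # which makes the sampling step differentiable w.r.t. mu and log_var.
    eps = tf.random.normal(tf.shape(mu))
    return mu + tf.exp(0.5 * log_var) * eps

def neg_elbo(x, x_logits, mu, log_var):
    # Reconstruction term: Bernoulli log-likelihood of the (flattened) input pixels.
    rec = tf.reduce_sum(
        tf.nn.sigmoid_cross_entropy_with_logits(labels=x, logits=x_logits), axis=-1)
    # KL(q(z|x) || N(0, I)) in closed form for a diagonal Gaussian posterior.
    kl = -0.5 * tf.reduce_sum(1.0 + log_var - tf.square(mu) - tf.exp(log_var), axis=-1)
    return tf.reduce_mean(rec + kl)
```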
19 | 17 |
|
20 | | -- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_neural_networks.ipynb). Demonstrates how to |
21 | | - implement and train a Bayesian neural network using a variational inference approach. Example implementation with Keras (see also |
22 | | - [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/wip-bnn-pymc4/bayesian_neural_networks_pymc4.ipynb)). |
| 18 | +- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1.ipynb)
| 19 | + [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1.ipynb). |
| 20 | + Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example |
| 21 | + implementation with plain NumPy/SciPy and scikit-learn for comparison (see also |
| 22 | + [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb)). |
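For orientation, the two EM steps for a Gaussian mixture can be written compactly with NumPy/SciPy; this is a sketch with naive initialization, not the notebook's implementation:

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=50, seed=0):
    """Fit a K-component Gaussian mixture to X of shape (N, D) with vanilla EM."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    pi = np.full(K, 1.0 / K)                      # mixing coefficients
    mu = X[rng.choice(N, K, replace=False)]       # means initialized from data points
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(D) for _ in range(K)])

    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] = p(component k | x_n)
        r = np.stack([pi[k] * multivariate_normal.pdf(X, mu[k], Sigma[k])
                      for k in range(K)], axis=1)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate parameters from weighted sufficient statistics
        Nk = r.sum(axis=0)
        pi = Nk / N
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            d = X - mu[k]
            Sigma[k] = (r[:, k, None] * d).T @ d / Nk[k] + 1e-6 * np.eye(D)
    return pi, mu, Sigma
```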
23 | 23 |
|
24 | | -- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression.ipynb). Introduction to Bayesian |
25 | | - linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn for comparison (see also |
26 | | - [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc4.ipynb) and |
27 | | - [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc3.ipynb)). |
| 24 | +- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/bayesian-optimization/bayesian_optimization.ipynb)
| 25 | + [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-optimization/bayesian_optimization.ipynb). |
| 26 | + Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with the libraries
| 27 | + scikit-optimize and GPyOpt. Hyperparameter tuning as an application example.
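The core loop, fit a GP surrogate, maximize an acquisition function, evaluate the objective, repeat, can be sketched with scikit-learn and an expected-improvement acquisition; the toy objective and candidate grid are illustrative:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X_cand, gpr, y_best, xi=0.01):
    # EI(x) = E[max(f(x) - y_best - xi, 0)] under the GP posterior (maximization).
    mu, sigma = gpr.predict(X_cand, return_std=True)
    imp = mu - y_best - xi
    z = imp / np.maximum(sigma, 1e-9)
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

def f(x):                                  # toy 1D objective to maximize
    return -np.sin(3 * x) - x**2 + 0.7 * x

X = np.array([[-0.9], [1.1]])              # initial design
y = f(X).ravel()
X_cand = np.linspace(-1.0, 2.0, 500).reshape(-1, 1)

for _ in range(10):
    gpr = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6).fit(X, y)
    x_next = X_cand[np.argmax(expected_improvement(X_cand, gpr, y.max()))]
    X = np.vstack([X, [x_next]])
    y = np.append(y, f(x_next))

print("best x:", X[np.argmax(y)], "best y:", y.max())
```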
28 | 28 |
|
29 | | -- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/gaussian_processes.ipynb)
30 | | - [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/gaussian_processes.ipynb). |
| 29 | +- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/gaussian-processes/gaussian_processes.ipynb)
| 30 | + [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/gaussian-processes/gaussian_processes.ipynb). |
31 | 31 | Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with the libraries
32 | 32 | scikit-learn and GPy.
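A bare-bones NumPy sketch of GP posterior prediction with an RBF kernel, in the spirit of the notebook (toy data and fixed hyperparameters, no model selection):

```python
import numpy as np

def kernel(X1, X2, l=1.0, sigma_f=1.0):
    # Squared exponential (RBF) kernel.
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return sigma_f**2 * np.exp(-0.5 * sq / l**2)

def gp_posterior(X_s, X_train, y_train, l=1.0, sigma_f=1.0, sigma_y=1e-8):
    """Posterior mean and covariance of a GP prior conditioned on noisy data."""
    K = kernel(X_train, X_train, l, sigma_f) + sigma_y**2 * np.eye(len(X_train))
    K_s = kernel(X_train, X_s, l, sigma_f)
    K_ss = kernel(X_s, X_s, l, sigma_f)
    mu_s = K_s.T @ np.linalg.solve(K, y_train)
    cov_s = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mu_s, cov_s

X_train = np.array([[-4.0], [-2.0], [0.0], [2.0], [4.0]])
y_train = np.sin(X_train).ravel()
X_s = np.linspace(-5, 5, 50).reshape(-1, 1)
mu_s, cov_s = gp_posterior(X_s, X_train, y_train, sigma_y=0.1)
```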
33 | 33 |
|
34 | | -- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/bayesian_optimization.ipynb)
35 | | - [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_optimization.ipynb). |
36 | | - Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with libraries |
37 | | - scikit-optimize and GPyOpt. Hyperparameter tuning as application example. |
| 34 | +- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression.ipynb). |
| 35 | + Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy, with scikit-learn
| 36 | + used for comparison (see also
| 37 | + [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
| 38 | + [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb)). |
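With a zero-mean isotropic Gaussian prior (precision `alpha`) and Gaussian noise (precision `beta`), both the weight posterior and the predictive distribution have closed forms (cf. Bishop, PRML, ch. 3). A sketch with a straight-line basis:

```python
import numpy as np

def posterior(Phi, t, alpha=2.0, beta=25.0):
    """Posterior N(w | m_N, S_N) for prior N(w | 0, alpha^-1 I) and noise precision beta."""
    S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

def predictive(Phi_s, m_N, S_N, beta=25.0):
    # Predictive mean and variance at new design-matrix rows Phi_s.
    mean = Phi_s @ m_N
    var = 1.0 / beta + np.sum(Phi_s @ S_N * Phi_s, axis=1)  # row-wise phi^T S_N phi
    return mean, var

# Straight-line basis [1, x] on noisy data
x = np.linspace(-1, 1, 20)
t = 0.5 * x - 0.3 + 0.2 * np.random.randn(20)
Phi = np.column_stack([np.ones_like(x), x])
m_N, S_N = posterior(Phi, t)
mean, var = predictive(Phi, m_N, S_N)   # predictive at the training inputs
```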
38 | 39 |
|
39 | | -- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/variational_autoencoder_dfc.ipynb). |
| 40 | +- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_dfc.ipynb). |
40 | 41 | Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example |
41 | 42 | implementation with Keras. |
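The central idea, replacing the pixel-space reconstruction loss with a feature-space (perceptual) loss, can be sketched with a pretrained VGG19 in Keras. The layer choices are illustrative, and inputs are assumed to be 3-channel images preprocessed as VGG expects:

```python
import tensorflow as tf

# Pretrained VGG19 as a fixed feature extractor; layer choice is illustrative.
vgg = tf.keras.applications.VGG19(include_top=False, weights="imagenet")
vgg.trainable = False
layer_names = ["block1_conv1", "block2_conv1", "block3_conv1"]
feature_model = tf.keras.Model(vgg.input,
                               [vgg.get_layer(n).output for n in layer_names])

def perceptual_loss(x, x_rec):
    # Compare images in VGG feature space instead of pixel space.
    f_true = feature_model(x)
    f_rec = feature_model(x_rec)
    return tf.add_n([tf.reduce_mean(tf.square(a - b))
                     for a, b in zip(f_true, f_rec)])
```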
42 | 43 |
|
43 | | -- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/variational_autoencoder_opt.ipynb). |
| 44 | +- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt.ipynb). |
44 | 45 | Describes an approach for conditionally generating outputs with desired properties by doing Bayesian optimization in |
45 | | - latent space of variational autoencoders. Example application implemented with Keras and GPyOpt. |
| 46 | + the latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.
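Conceptually, the recipe is: decode a latent point, score the result for the desired property, and let Bayesian optimization pick the next latent point. A GPyOpt-flavored sketch with toy stand-ins for the decoder and the property scorer (both hypothetical, only so the snippet runs):

```python
import numpy as np
import GPyOpt

latent_dim = 2

class ToyDecoder:                       # stand-in for a trained VAE decoder
    def predict(self, Z):
        return np.tanh(Z)               # pretend decoded outputs are just tanh(z)

decoder = ToyDecoder()

def property_score(X):                  # stand-in for a property predictor
    return -np.sum((X - 0.5) ** 2, axis=1)

def objective(Z):
    X = decoder.predict(Z)                        # decode latent points
    return -property_score(X).reshape(-1, 1)      # negate: GPyOpt minimizes

bounds = [{"name": f"z{i}", "type": "continuous", "domain": (-3.0, 3.0)}
          for i in range(latent_dim)]

optimizer = GPyOpt.methods.BayesianOptimization(f=objective, domain=bounds)
optimizer.run_optimization(max_iter=30)
z_best = optimizer.x_opt                          # best latent code found
x_best = decoder.predict(z_best.reshape(1, -1))   # decoded output
```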