
Commit d560ba5 (parent a9973df)

Move notebooks to new location

- update Github links
- update nbviewer links


43 files changed (+47, -829 lines)

README.md

Lines changed: 27 additions & 26 deletions
@@ -2,44 +2,45 @@

 [![DOI](https://zenodo.org/badge/125869131.svg)](https://zenodo.org/badge/latestdoi/125869131)

-This repository is a collection of notebooks about *Bayesian Machine Learning*. The following links display
-the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a proper rendering of formulas. **Update:**
-PyMC3 and PyMC4 implementations are now available for some notebooks (more planned).
+This repository is a collection of notebooks related to *Bayesian Machine Learning*. The following links display
+the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a proper rendering of formulas.

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1.ipynb)
-[Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1.ipynb).
-Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
-implementation with plain NumPy/SciPy and scikit-learn for comparison (see also
-[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_1_pymc3.ipynb)).
+- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-neural-networks/bayesian_neural_networks.ipynb).
+Demonstrates how to implement a Bayesian neural network and variational inference of network parameters. Example implementation
+with Keras (see also
+[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb)).

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_2.ipynb)
-[Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent_variable_models_part_2.ipynb).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_2.ipynb)
+[Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_2.ipynb).
 Introduction to stochastic variational inference with variational autoencoder as application example. Implementation
 with Tensorflow 2.x.

-- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_neural_networks.ipynb). Demonstrates how to
-implement and train a Bayesian neural network using a variational inference approach. Example implementation with Keras (see also
-[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/wip-bnn-pymc4/bayesian_neural_networks_pymc4.ipynb)).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1.ipynb)
+[Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1.ipynb).
+Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
+implementation with plain NumPy/SciPy and scikit-learn for comparison (see also
+[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb)).

-- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression.ipynb). Introduction to Bayesian
-linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn for comparison (see also
-[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc4.ipynb) and
-[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_linear_regression_pymc3.ipynb)).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/bayesian-optimization/bayesian_optimization.ipynb)
+[Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-optimization/bayesian_optimization.ipynb).
+Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with libraries
+scikit-optimize and GPyOpt. Hyper-parameter tuning as application example.

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/gaussian_processes.ipynb)
-[Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/gaussian_processes.ipynb).
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/gaussian-processes/gaussian_processes.ipynb)
+[Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/gaussian-processes/gaussian_processes.ipynb).
 Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with libraries
 scikit-learn and GPy.

-- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/master/bayesian_optimization.ipynb)
-[Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian_optimization.ipynb).
-Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with libraries
-scikit-optimize and GPyOpt. Hyperparameter tuning as application example.
+- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression.ipynb).
+Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn
+for comparison (see also
+[PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
+[PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb)).

-- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/variational_autoencoder_dfc.ipynb).
+- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_dfc.ipynb).
 Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example
 implementation with Keras.

-- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/variational_autoencoder_opt.ipynb).
+- [Conditional generation via Bayesian optimization in latent space](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/master/autoencoder-applications/variational_autoencoder_opt.ipynb).
 Describes an approach for conditionally generating outputs with desired properties by doing Bayesian optimization in
-latent space of variational autoencoders. Example application implemented with Keras and GPyOpt.
+latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.
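For context on what the relocated notebooks cover: the "Latent variable models, part 1" notebook fits Gaussian mixture models with the EM algorithm, using scikit-learn for comparison. A minimal sketch of that technique with scikit-learn's `GaussianMixture` (the toy data and settings below are illustrative assumptions, not taken from the notebook):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy data: two well-separated 2-D Gaussian clusters
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=-3.0, scale=1.0, size=(200, 2)),
    rng.normal(loc=+3.0, scale=1.0, size=(200, 2)),
])

# EM fit: alternates the E-step (component responsibilities)
# and the M-step (weight/mean/covariance updates) until converged
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X)

print(gmm.converged_)  # whether EM reached its tolerance
print(gmm.means_)      # component means, near (-3, -3) and (3, 3)
```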
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
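Similarly, the relocated Bayesian linear regression notebook implements the conjugate posterior over weights from scratch with plain NumPy. A short sketch of that standard update (with known prior precision `alpha` and noise precision `beta`; the values and toy data here are illustrative assumptions):

```python
import numpy as np

# Conjugate Bayesian linear regression with known precisions:
# prior w ~ N(0, alpha^-1 I), likelihood t ~ N(Phi w, beta^-1 I).
# Posterior: S_N = (alpha I + beta Phi^T Phi)^-1, m_N = beta S_N Phi^T t.
alpha, beta = 2.0, 25.0  # beta = 1 / noise_variance = 1 / 0.2**2

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=50)
t = -0.3 + 0.5 * x + rng.normal(scale=0.2, size=50)  # true weights (-0.3, 0.5)

Phi = np.column_stack([np.ones_like(x), x])  # linear basis functions [1, x]
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t

print(m_N)  # posterior mean, close to the true weights (-0.3, 0.5)
```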
