### Bayesian layers and utilities to perform stochastic variational inference in PyTorch
<img src="assets/bayesian-torch.png" width="500px">

<h2>
A library for Bayesian neural network layers and uncertainty estimation in Deep Learning
</h2>

___

Bayesian-Torch is a library of neural network layers and utilities that extends the core of PyTorch to enable Bayesian inference in deep learning models, providing principled uncertainty estimates for model predictions.

## Overview

Bayesian-Torch is designed to be flexible, enabling seamless extension of a deterministic deep neural network model to the corresponding Bayesian form by simply replacing the deterministic layers with Bayesian layers. It allows the user to perform stochastic variational inference in deep neural networks.

**Bayesian layers:**

* **[Variational layers with reparameterized Monte Carlo estimators](https://github.com/IntelLabs/bayesian-torch/tree/main/bayesian_torch/layers/variational_layers)** [[Blundell et al. 2015](https://arxiv.org/abs/1505.05424)]
* **[Variational layers with Flipout Monte Carlo estimators](https://github.com/IntelLabs/bayesian-torch/tree/main/bayesian_torch/layers/flipout_layers)** [[Wen et al. 2018](https://arxiv.org/abs/1803.04386)]

Please refer to the [documentation](doc/bayesian_torch.layers.md#layers) of the Bayesian layers for details.
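The reparameterized estimator named above can be sketched in plain Python: a weight is drawn as w = mu + sigma * eps with eps ~ N(0, 1), where sigma is obtained from an unconstrained parameter rho through a softplus so it stays positive. This is a minimal illustrative sketch; `sample_weight`, `mu` and `rho` are names chosen here, not the library's internal attributes.

```python
import math
import random

def sample_weight(mu, rho, eps=None):
    """Reparameterization trick: w = mu + sigma * eps, with eps ~ N(0, 1).
    sigma = softplus(rho) keeps the standard deviation positive while
    rho itself can be optimized without constraints."""
    if eps is None:
        eps = random.gauss(0.0, 1.0)
    sigma = math.log1p(math.exp(rho))  # softplus(rho)
    return mu + sigma * eps

# Averaging many samples recovers the variational mean mu
random.seed(0)
draws = [sample_weight(mu=0.5, rho=-3.0) for _ in range(100_000)]
print(sum(draws) / len(draws))  # close to 0.5; sigma = softplus(-3) is ~0.049
```

Because the noise eps is sampled independently of mu and rho, gradients of a loss can flow through both parameters directly, which is what makes stochastic variational inference trainable with ordinary backpropagation.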
**Key features:**

* [dnn_to_bnn()](https://github.com/IntelLabs/bayesian-torch/blob/main/bayesian_torch/models/dnn_to_bnn.py#L127): An API to convert a deterministic deep neural network (dnn) model of any architecture into a Bayesian deep neural network (bnn) model, simplifying the model definition through drop-in replacement of Convolutional, Linear and LSTM layers with the corresponding Bayesian layers. This enables seamless conversion of existing large-model topologies into Bayesian deep neural network models for uncertainty-aware applications.
* [MOPED](https://github.com/IntelLabs/bayesian-torch/blob/main/bayesian_torch/utils/util.py#L72): Specifying weight priors and variational posteriors in Bayesian neural networks with Empirical Bayes [[Krishnan et al. 2020](https://ojs.aaai.org/index.php/AAAI/article/view/5875)]
* [AvUC](https://github.com/IntelLabs/bayesian-torch/blob/main/bayesian_torch/utils/avuc_loss.py): Accuracy versus Uncertainty Calibration loss [[Krishnan and Tickoo 2020](https://proceedings.neurips.cc/paper/2020/file/d3d9446802a44259755d38e6d163e820-Paper.pdf)]
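Both the variational layers and the MOPED prior revolve around a KL-divergence term between a Gaussian variational posterior and a Gaussian prior, which has a simple closed form. A minimal sketch of that term (the function name is illustrative, not part of the library API):

```python
import math

def kl_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """Closed-form KL(q || p) for univariate Gaussians:
    log(sigma_p / sigma_q) + (sigma_q^2 + (mu_q - mu_p)^2) / (2 * sigma_p^2) - 1/2.
    Summed over all weights, this is the complexity term that stochastic
    variational inference adds to the task loss."""
    return (math.log(sigma_p / sigma_q)
            + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sigma_p ** 2)
            - 0.5)

print(kl_gaussians(0.0, 1.0, 0.0, 1.0))  # 0.0: identical distributions
print(kl_gaussians(1.0, 1.0, 0.0, 1.0))  # 0.5: posterior mean shifted by 1
```

Initializing the posterior mean from a pretrained deterministic weight (as MOPED does) keeps this term small at the start of training, which is why it helps convergence of larger models.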

## Installing Bayesian-Torch

**To install core library using `pip`:**
```
pip install bayesian-torch
```

<!--
Dependencies:

- pip install tensorboard
- pip install scikit-learn
-->

## Usage

There are two ways to build Bayesian deep neural networks using Bayesian-Torch:

1. Convert an existing deterministic deep neural network (dnn) model to a Bayesian deep neural network (bnn) model with the dnn_to_bnn() API
2. Define your custom model using the Bayesian layers ([Reparameterization](https://github.com/IntelLabs/bayesian-torch/tree/main/bayesian_torch/layers/variational_layers) or [Flipout](https://github.com/IntelLabs/bayesian-torch/tree/main/bayesian_torch/layers/flipout_layers))

(1) For instance, building a Bayesian-ResNet18 from the torchvision deterministic ResNet18 model is as simple as:
```
const_bnn_prior_parameters = {
    ...
}
model = torchvision.models.resnet18()
dnn_to_bnn(model, const_bnn_prior_parameters)
```

To use the MOPED method, i.e. setting the prior and initializing the variational parameters from a pretrained deterministic model (which helps training convergence of larger models):
```
const_bnn_prior_parameters = {
    "prior_mu": 0.0,
    ...
}
```
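At inference time, a Bayesian model built either way is run several times on the same input: each forward pass samples different weights, and the spread of the sampled predictions quantifies uncertainty. The loop below is a hypothetical sketch of that Monte Carlo procedure using a stand-in stochastic function rather than an actual bnn; `predict_with_uncertainty` and `noisy_model` are names invented here.

```python
import random
import statistics

def predict_with_uncertainty(stochastic_forward, x, num_mc=20):
    """Run the stochastic model num_mc times on the same input and
    summarize the sampled predictions: the mean is the prediction,
    the standard deviation is a simple uncertainty estimate."""
    preds = [stochastic_forward(x) for _ in range(num_mc)]
    return statistics.fmean(preds), statistics.pstdev(preds)

# Stand-in for a Bayesian model: each call samples a weight around 2.0
random.seed(0)
noisy_model = lambda x: random.gauss(2.0, 0.1) * x
mean, std = predict_with_uncertainty(noisy_model, x=3.0, num_mc=2000)
print(mean, std)  # mean near 6.0, std near 0.3
```

More Monte Carlo samples give a smoother estimate at proportionally higher inference cost, so `num_mc` trades accuracy of the uncertainty estimate against latency.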

This library is intended for researchers and developers; it enables quantifying principled uncertainty estimates from deep learning model predictions using stochastic variational inference in Bayesian neural networks.

Feedback, issues and contributions are welcome. Email <[email protected]> for any questions.