### Bayesian layers and utilities to perform stochastic variational inference in PyTorch
Please refer to [documentation](doc/bayesian_torch.layers.md#layers) of Bayesian layers for details.
Other features include:

- [x] [dnn_to_bnn()](https://github.com/IntelLabs/bayesian-torch/blob/main/bayesian_torch/models/dnn_to_bnn.py#L127): An API to convert a deterministic deep neural network (dnn) model of any architecture into a Bayesian deep neural network (bnn) model, simplifying the model definition, i.e. drop-in replacement of Convolutional, Linear and LSTM layers with the corresponding Bayesian layers. This enables seamless conversion of the existing topology of larger models to Bayesian deep neural network models for extension towards uncertainty-aware applications.
- [x] [MOPED](https://github.com/IntelLabs/bayesian-torch/blob/main/bayesian_torch/utils/util.py#L72): Specifying weight priors and variational posteriors in Bayesian neural networks with Empirical Bayes [[Krishnan et al. 2020](https://ojs.aaai.org/index.php/AAAI/article/view/5875)]
- [x] [AvUC](https://github.com/IntelLabs/bayesian-torch/blob/main/bayesian_torch/utils/avuc_loss.py): Accuracy versus Uncertainty Calibration loss [[Krishnan and Tickoo 2020](https://proceedings.neurips.cc/paper/2020/file/d3d9446802a44259755d38e6d163e820-Paper.pdf)]
There are two ways to build Bayesian deep neural networks using Bayesian-Torch:
1. Convert an existing deterministic deep neural network (dnn) model to Bayesian deep neural network (bnn) model with dnn_to_bnn()
2. Define your custom model using the Bayesian layers ([Flipout](https://github.com/IntelLabs/bayesian-torch/tree/main/bayesian_torch/layers/flipout_layers) or [Reparameterization](https://github.com/IntelLabs/bayesian-torch/tree/main/bayesian_torch/layers/variational_layers))
(1) For instance, building Bayesian-ResNet18 from the torchvision deterministic ResNet18 model is as simple as:
```
import torch
import torchvision
from bayesian_torch.models.dnn_to_bnn import dnn_to_bnn, get_kl_loss

const_bnn_prior_parameters = {
    "prior_mu": 0.0,
    "prior_sigma": 1.0,
    "posterior_mu_init": 0.0,
    "posterior_rho_init": -3.0,
    "type": "Reparameterization",  # Flipout or Reparameterization
    "moped_enable": False,  # True to initialize mu/sigma from the pretrained dnn weights
    "moped_delta": 0.5,
}

model = torchvision.models.resnet18()
dnn_to_bnn(model, const_bnn_prior_parameters)
```
To use the MOPED method, set the prior and initialize variational parameters from a pretrained deterministic model (this helps training convergence of larger models):