Commit 1057bfb

Update bayesian_torch.layers.md
1 parent 035460a · commit 1057bfb


doc/bayesian_torch.layers.md

Lines changed: 8 additions & 2 deletions
```diff
@@ -3,8 +3,10 @@ A set of Bayesian neural network layers to perform stochastic variational infere
 
 - Variational layers with reparameterized Monte Carlo estimators [[Blundell et al. 2015](https://arxiv.org/abs/1505.05424)]
 - Variational layers with Flipout Monte Carlo estimators [[Wen et al. 2018](https://arxiv.org/abs/1803.04386)]
+<!--
 - Radial BNN layers [[Farquhar et al. 2020](https://arxiv.org/abs/1907.00865)]
 - Variational layers with Gaussian mixture model (GMM) posteriors using reparameterized Monte Carlo estimators (in pre-alpha)
+-->
 
 # Layers
 
```

```diff
@@ -29,7 +31,7 @@ A set of Bayesian neural network layers to perform stochastic variational infere
 * [ConvTranspose3dFlipout](#class-convtranspose3dflipout)
 * [LSTMFlipout](#class-lstmflipout)
 
-
+<!--
 * [LinearRadial](#class-linearradial)
 * [Conv1dRadial](#class-conv1dradial)
 * [Conv2dRadial](#class-conv2dradial)
@@ -48,7 +50,7 @@ A set of Bayesian neural network layers to perform stochastic variational infere
 * [ConvTranspose2dMixture](#class-convtranspose2dmixture)
 * [ConvTranspose3dMixture](#class-convtranspose3dmixture)
 * [LSTMMixture](#class-lstmmixture)
-
+-->
 
 
 
```
```diff
@@ -66,6 +68,7 @@ Calculates the Kullback-Leibler divergence from distribution normal Q (parametri
 ##### Returns
 torch.Tensor of shape 0
 
+<!--
 ## class BaseMixtureLayer_(torch.nn.Module)
 Abstract class which inherits from BaseVariationalLayer_, powered with method to calculate the a KL divergence sample between two mixture of gaussians.
 
```
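The KL helper referenced in this hunk computes the Kullback-Leibler divergence from a Gaussian posterior Q to a Gaussian prior P and returns a scalar (shape-0) tensor. Below is a minimal standalone sketch of the standard closed-form expression; the helper name and signature are illustrative, not necessarily the library's exact API.

```python
import torch

def gaussian_kl(mu_q: torch.Tensor, sigma_q: torch.Tensor,
                mu_p: torch.Tensor, sigma_p: torch.Tensor) -> torch.Tensor:
    """Closed-form KL(Q || P) for elementwise Gaussians, summed to a 0-dim tensor."""
    kl = (torch.log(sigma_p / sigma_q)
          + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2.0 * sigma_p ** 2)
          - 0.5)
    return kl.sum()
```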
```diff
@@ -85,6 +88,7 @@ Calculates a sample of KL divergence between two mixture of gaussians (Q || P),
 
 ##### Returns
 torch.Tensor of shape 0
+-->
 
 ## class LinearReparameterization
 ### bayesian_torch.layers.LinearReparameterization(in_features, out_features, prior_mean, prior_variance, posterior_mu_init, posterior_rho_init, bias=True)
```
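The LinearReparameterization constructor in this hunk takes the prior and posterior parameters explicitly. A brief usage sketch follows, assuming the documented behavior that the forward pass samples the weights and also returns the KL term; the hyperparameter values are illustrative, not defaults prescribed by this page.

```python
import torch
from bayesian_torch.layers import LinearReparameterization

# Illustrative hyperparameters (assumed for the example, not prescribed here).
layer = LinearReparameterization(
    in_features=128,
    out_features=64,
    prior_mean=0.0,
    prior_variance=1.0,
    posterior_mu_init=0.0,
    posterior_rho_init=-3.0,
    bias=True,
)

x = torch.randn(32, 128)
out, kl = layer(x)  # stochastic output plus KL from the weight posterior to the prior
# For ELBO-style training, the (scaled) KL term is typically added to the task loss:
loss = torch.nn.functional.mse_loss(out, torch.zeros_like(out)) + kl / x.shape[0]
```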
```diff
@@ -539,6 +543,7 @@ Samples the weights with Flipout and performs LSTM feedforward operation.
 
 ---
 
+<!--
 ## class LinearRadial
 ### bayesian_torch.layers.LinearRadial(in_features, out_features, prior_mean, prior_variance, posterior_mu_init, posterior_rho_init, bias=True)
 #### Parameters:
@@ -1020,3 +1025,4 @@ Samples the weights with mixture (gaussian bimodal) reparameterization and perfo
 , float corresponding to KL divergence from the samples weights distribution to the prior
 
 ---
+-->
```
