@@ -66,6 +68,7 @@ Calculates the Kullback-Leibler divergence from distribution normal Q (parametri
##### Returns
torch.Tensor of shape 0 (a 0-dimensional, scalar tensor; see the sketch after this hunk)
+<!--
## class BaseMixtureLayer_(torch.nn.Module)
Abstract class which inherits from BaseVariationalLayer_, providing a method to calculate a sampled KL divergence between two mixtures of Gaussians.
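The first hunk above documents the KL divergence between two normal distributions Q and P, returned as a 0-dimensional tensor. As a rough illustration only (not the library's actual implementation), the closed-form KL between diagonal Gaussians can be computed as below; the function name `kl_divergence_normal` and its signature are assumptions made for the sketch.

```python
import torch

def kl_divergence_normal(mu_q, sigma_q, mu_p, sigma_p):
    # Closed-form KL(Q || P) for diagonal Gaussians, summed over all
    # elements so the result is a 0-dimensional (scalar) torch.Tensor.
    kl = (torch.log(sigma_p / sigma_q)
          + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sigma_p ** 2)
          - 0.5)
    return kl.sum()

# Example: per-weight posterior Q vs. a standard-normal prior P.
mu_q, sigma_q = torch.randn(3, 4), torch.rand(3, 4) + 0.1
mu_p, sigma_p = torch.zeros(3, 4), torch.ones(3, 4)
print(kl_divergence_normal(mu_q, sigma_q, mu_p, sigma_p).shape)  # torch.Size([])
```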
@@ -85,6 +88,7 @@ Calculates a sample of KL divergence between two mixture of gaussians (Q || P),
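Because the KL divergence between two mixtures of Gaussians has no closed form, the method documented in this hunk returns a sampled (Monte Carlo) estimate. A minimal sketch of that idea using `torch.distributions` follows; the helper name `sampled_kl_mixture`, the mixture parameters, and the sample count are illustrative assumptions, not the library's code.

```python
import torch
from torch.distributions import Categorical, MixtureSameFamily, Normal

def sampled_kl_mixture(q, p, n_samples=1000):
    # Monte Carlo estimate of KL(Q || P): draw samples from Q and
    # average log q(x) - log p(x); returns a 0-dimensional tensor.
    x = q.sample((n_samples,))
    return (q.log_prob(x) - p.log_prob(x)).mean()

# Two one-dimensional mixtures of Gaussians.
q = MixtureSameFamily(Categorical(probs=torch.tensor([0.7, 0.3])),
                      Normal(torch.tensor([0.0, 2.0]), torch.tensor([1.0, 0.5])))
p = MixtureSameFamily(Categorical(probs=torch.tensor([0.5, 0.5])),
                      Normal(torch.tensor([0.0, 3.0]), torch.tensor([1.0, 1.0])))
print(sampled_kl_mixture(q, p))  # scalar tensor; the value varies with the sampled points
```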