Replies: 1 comment 4 replies
Hey, as far as I remember the rule looks like:

```julia
@rule Bernoulli(:p, Marginalisation) (m_out::Bernoulli,) = begin
    @logscale ...
    return Beta(..., ...)
end
```
---
I'm trying to understand how to implement Bayesian Model Combination (BMC) in RxInfer, for a mixture of Bernoulli or Categorical variables (I'm not too interested in the Gaussian counterpart for now).
I found the description in the paper *Automating Model Comparison in Factor Graphs*. The paper describes three possibilities, all of which I think are implementable with RxInfer's `Mixture` node:

- BMA: a single selector `m`, where inference returns a soft posterior over `m`, `q(m)`;
- BMS: `q(m)` is constrained to a Kronecker delta (hard selection);
- BMC: a `pi` with a Dirichlet prior, shared across observations. Each observation `n` gets its own selector `m[n] ~ Bernoulli(pi)`, and `pi` accumulates evidence from all observations.

It seems that BMC would allow me to find the best weighted combination of models rather than converging to a single best one.
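As a sanity check outside RxInfer, here is a toy numeric sketch of the BMC update I have in mind: plain Python/NumPy, two fixed Bernoulli candidate models, a shared Dirichlet prior over the weights, and an EM-style loop in which each observation contributes its own soft selector responsibility. All names here are mine; nothing in this snippet is RxInfer API.

```python
import numpy as np

rng = np.random.default_rng(0)

models = np.array([0.8, 0.3])     # heads probability under each candidate model
data = rng.random(200) < 0.75     # observations, generated mostly "heads"

alpha0 = np.ones(2)               # Dirichlet prior over the shared weights pi
weights = alpha0 / alpha0.sum()

for _ in range(20):
    # p(x_n | m): per-observation likelihood under each model
    lik = np.where(data[:, None], models, 1.0 - models)
    # q(m[n]): soft responsibility of each model for each observation
    resp = weights * lik
    resp /= resp.sum(axis=1, keepdims=True)
    # conjugate Dirichlet update: pi accumulates evidence from all observations
    alpha = alpha0 + resp.sum(axis=0)
    weights = alpha / alpha.sum()

print(weights)  # a soft combination of models, not a hard winner
```

The point is the last comment: the posterior weights stay soft, favoring the better-fitting model without collapsing onto it, which is the behavior I'd like to get from RxInfer.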
I tried starting from the John/Jane coin toss example in the Universal Mixtures docs.
But as far as I can tell, this is BMS/BMA, not BMC: there is a single shared switch variable with a fixed prior probability.
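In symbols, the distinction as I understand it (my notation, following my reading of the paper): the shared-switch setup is

```math
m \sim \mathrm{Bernoulli}(\pi_0), \qquad x_n \mid m \sim p_m(x_n) \quad \text{for all } n,
```

with a fixed prior probability $\pi_0$ (BMA keeps the soft posterior $q(m)$, while BMS constrains $q(m)$ to a Kronecker delta), whereas BMC would be

```math
\pi \sim \mathrm{Dirichlet}(\alpha), \qquad m_n \mid \pi \sim \mathrm{Bernoulli}(\pi), \qquad x_n \mid m_n \sim p_{m_n}(x_n),
```

so each observation has its own selector and $\pi$ accumulates evidence across all of them.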
If I understand correctly, a BMC model should instead give each observation `n` its own selector `m[n] ~ Bernoulli(pi)`, with `pi` given a shared Dirichlet prior. But even after adding `MeanField()` and `AddonLogScale()`, which I think were implied by the errors, I cannot get this to run.

So, my questions would be:
1. Is the `beta_model_mary` example actually BMA/BMS, or am I misreading it?
2. Is BMC with a shared `pi` currently supported through the `Mixture` node, and if so, what is the correct way to set up the model, constraints, and initialization?

Thank you!