@@ -187,7 +187,7 @@ rand(bn_gibbs, gsampler, 5)

BayesNets.jl supports parameter learning for an entire graph.

-``` julia
+``` julia
fit(BayesNet, data, (:a=>:b), [StaticCPD{Normal}, LinearGaussianCPD])
```

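As a hedged illustration of the whole-graph `fit` above (the `data` here is made up for the sketch; assuming a `DataFrame` whose column names match the edge `:a=>:b`), the learned CPDs can then be retrieved from the network:

```julia
using DataFrames, BayesNets

# Hypothetical data for illustration only; not from the docs.
data = DataFrame(a=randn(100), b=randn(100))

# Fit a StaticCPD{Normal} for :a and a LinearGaussianCPD for :b, as in the example.
bn = fit(BayesNet, data, (:a=>:b), [StaticCPD{Normal}, LinearGaussianCPD])

cpd = get(bn, :b)   # inspect the learned CPD for :b
```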
@@ -223,7 +223,7 @@ Inference methods for discrete Bayesian networks can be used via the `infer` met
bn = DiscreteBayesNet()
push!(bn, DiscreteCPD(:a, [0.3,0.7]))
push!(bn, DiscreteCPD(:b, [0.2,0.8]))
-push!(bn, DiscreteCPD(:c, [:a, :b], [2,2],
+push!(bn, DiscreteCPD(:c, [:a, :b], [2,2],
        [Categorical([0.1,0.9]),
         Categorical([0.2,0.8]),
         Categorical([1.0,0.0]),
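The hunk above is truncated mid-example. As a hedged sketch of how the network might be completed and queried with `infer` (the fourth `Categorical` entry is an assumption, since the diff cuts off before it):

```julia
using BayesNets

bn = DiscreteBayesNet()
push!(bn, DiscreteCPD(:a, [0.3,0.7]))
push!(bn, DiscreteCPD(:b, [0.2,0.8]))
push!(bn, DiscreteCPD(:c, [:a, :b], [2,2],
        [Categorical([0.1,0.9]),
         Categorical([0.2,0.8]),
         Categorical([1.0,0.0]),
         Categorical([0.4,0.6]),   # assumed entry; the diff truncates here
        ]))

# Query the distribution over :c given the evidence a = 1.
ϕ = infer(bn, :c, evidence=Assignment(:a=>1))
```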
@@ -283,7 +283,7 @@ data[1:3,:] # only display a subset...
Here we use the K2 structure learning algorithm which runs in polynomial time but requires that we specify a topological node ordering.

``` @example bayesnet
-parameters = K2GraphSearch([:Species, :SepalLength, :SepalWidth, :PetalLength, :PetalWidth],
+parameters = K2GraphSearch([:Species, :SepalLength, :SepalWidth, :PetalLength, :PetalWidth],
                           ConditionalLinearGaussianCPD,
                           max_n_parents=2)
bn = fit(BayesNet, data, parameters)
@@ -300,7 +300,7 @@ Changing the ordering will change the structure.

``` julia
CLG = ConditionalLinearGaussianCPD
-parameters = K2GraphSearch([:Species, :PetalLength, :PetalWidth, :SepalLength, :SepalWidth],
+parameters = K2GraphSearch([:Species, :PetalLength, :PetalWidth, :SepalLength, :SepalWidth],
                           [StaticCPD{Categorical}, CLG, CLG, CLG, CLG],
                           max_n_parents=2)
fit(BayesNet, data, parameters)
@@ -311,7 +311,7 @@ A `ScoringFunction` allows for extracting a scoring metric for a CPD given data.
A `GraphSearchStrategy` defines a structure learning algorithm. The K2 algorithm is defined through `K2GraphSearch`, and `GreedyHillClimbing` is implemented for discrete Bayesian networks and the Bayesian score:

``` @example bayesnet
-data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
+data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
                 b=[1,1,1,2,2,2,2,1,1,2,1,1],
                 a=[1,1,1,2,1,1,2,1,1,2,1,1])
parameters = GreedyHillClimbing(ScoreComponentCache(data), max_n_parents=3, prior=UniformPrior())
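The hunk ends before the search is actually run. Assuming the `fit` pattern used elsewhere in these docs, the hill-climbing search would then be invoked with the `data` and `parameters` defined above:

```julia
# Hedged sketch: learn a discrete network structure via greedy hill climbing.
bn = fit(DiscreteBayesNet, data, parameters)
```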
@@ -339,6 +339,14 @@ A whole suite of features are supported for DiscreteBayesNets. Here, we illustra
We also detail obtaining a Bayesian score for a network structure in the next section.

``` julia
+bn = DiscreteBayesNet()
+push!(bn, DiscreteCPD(:hospital, [:a, :b], [2,2],
+        [Categorical([0.9,0.1]),
+         Categorical([0.2,0.8]),
+         Categorical([0.7,0.3]),
+         Categorical([0.01,0.99]),
+        ]))
+
count(bn, :a, data) # 1
statistics(bn.dag, data) # 2
table(bn, :b) # 3
@@ -363,12 +371,10 @@ TikzPictures.save(SVG("plot10"), plot) # hide
The Bayesian score for a discrete-valued BayesNet can be calculated based only on the structure and data (the CPDs do not need to be defined beforehand). This is implemented with a method of `bayesian_score` that takes in a directed graph, the names of the nodes, and data.

``` @example bayesnet
-data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
+data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
                 b=[1,1,1,2,2,2,2,1,1,2,1,1],
                 a=[1,1,1,2,1,1,2,1,1,2,1,1])
g = DAG(3)
add_edge!(g,1,2); add_edge!(g,2,3); add_edge!(g,1,3)
bayesian_score(g, [:a,:b,:c], data)
```
-
-
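Because `bayesian_score` needs only a structure and the data, it can also be used to compare candidate DAGs directly. A minimal sketch, assuming the same `data` as above:

```julia
# Hedged sketch: score an edgeless DAG against the connected one above.
g_empty = DAG(3)   # no edges: treats :a, :b, :c as mutually independent
bayesian_score(g_empty, [:a,:b,:c], data)
# The structure with the higher score is the one the data favors.
```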