
Commit 120140e

added examples for discrete CPDS
1 parent: 834158a


docs/src/usage.md

Lines changed: 14 additions & 8 deletions
@@ -187,7 +187,7 @@ rand(bn_gibbs, gsampler, 5)
 
 BayesNets.jl supports parameter learning for an entire graph.
 
-```julia
+```julia
 fit(BayesNet, data, (:a=>:b), [StaticCPD{Normal}, LinearGaussianCPD])
 ```
 
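For readers skimming the diff, a self-contained version of the `fit` call in this hunk may help: the edge pair gives the graph structure and the CPD-type vector presumably supplies one CPD type per node in the order listed (here `:a`, then `:b`). The `data` columns below are made up for illustration and are not part of the commit.

```julia
# Hedged sketch, not part of this commit: synthetic data with columns :a and :b.
using BayesNets, DataFrames, Distributions

data = DataFrame(a = randn(100), b = 2.0 .* randn(100) .+ 1.0)

# Structure from the edge pair (:a => :b); one CPD type per node, in the order listed.
bn = fit(BayesNet, data, (:a=>:b), [StaticCPD{Normal}, LinearGaussianCPD])
```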
@@ -223,7 +223,7 @@ Inference methods for discrete Bayesian networks can be used via the `infer` method:
 bn = DiscreteBayesNet()
 push!(bn, DiscreteCPD(:a, [0.3,0.7]))
 push!(bn, DiscreteCPD(:b, [0.2,0.8]))
-push!(bn, DiscreteCPD(:c, [:a, :b], [2,2],
+push!(bn, DiscreteCPD(:c, [:a, :b], [2,2],
         [Categorical([0.1,0.9]),
          Categorical([0.2,0.8]),
          Categorical([1.0,0.0]),
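The hunk is cut off mid-call by the diff context window. Here is a hedged, self-contained sketch of the same construction: the fourth `Categorical` is a placeholder value and the `infer` call is an assumption based on the method named in the hunk header, not something shown in the commit.

```julia
# Hedged sketch; the fourth Categorical and the infer call are illustrative assumptions.
using BayesNets, Distributions

bn = DiscreteBayesNet()
push!(bn, DiscreteCPD(:a, [0.3,0.7]))
push!(bn, DiscreteCPD(:b, [0.2,0.8]))
push!(bn, DiscreteCPD(:c, [:a, :b], [2,2],
        [Categorical([0.1,0.9]),
         Categorical([0.2,0.8]),
         Categorical([1.0,0.0]),
         Categorical([0.5,0.5])]))  # placeholder distribution, not from the commit

# Query P(c | a = 1); keyword form assumed for the `infer` method mentioned above.
ϕ = infer(bn, :c, evidence=Assignment(:a=>1))
```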
@@ -283,7 +283,7 @@ data[1:3,:] # only display a subset...
 Here we use the K2 structure learning algorithm which runs in polynomial time but requires that we specify a topological node ordering.
 
 ```@example bayesnet
-parameters = K2GraphSearch([:Species, :SepalLength, :SepalWidth, :PetalLength, :PetalWidth],
+parameters = K2GraphSearch([:Species, :SepalLength, :SepalWidth, :PetalLength, :PetalWidth],
                            ConditionalLinearGaussianCPD,
                            max_n_parents=2)
 bn = fit(BayesNet, data, parameters)
@@ -300,7 +300,7 @@ Changing the ordering will change the structure.
 
 ```julia
 CLG = ConditionalLinearGaussianCPD
-parameters = K2GraphSearch([:Species, :PetalLength, :PetalWidth, :SepalLength, :SepalWidth],
+parameters = K2GraphSearch([:Species, :PetalLength, :PetalWidth, :SepalLength, :SepalWidth],
                            [StaticCPD{Categorical}, CLG, CLG, CLG, CLG],
                            max_n_parents=2)
 fit(BayesNet, data, parameters)
@@ -311,7 +311,7 @@ A `ScoringFunction` allows for extracting a scoring metric for a CPD given data.
 A `GraphSearchStrategy` defines a structure learning algorithm. The K2 algorithm is defined through `K2GraphSearch` and `GreedyHillClimbing` is implemented for discrete Bayesian networks and the Bayesian score:
 
 ```@example bayesnet
-data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
+data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
                  b=[1,1,1,2,2,2,2,1,1,2,1,1],
                  a=[1,1,1,2,1,1,2,1,1,2,1,1])
 parameters = GreedyHillClimbing(ScoreComponentCache(data), max_n_parents=3, prior=UniformPrior())
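The hunk stops at the parameter setup. By analogy with the K2 example earlier in the diff, the natural next step is to pass these parameters to `fit`; the exact call below is an assumption rather than part of the commit.

```julia
# Hedged follow-up: learn the structure with the hill-climbing parameters above.
bn = fit(DiscreteBayesNet, data, parameters)
```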
@@ -339,6 +339,14 @@ A whole suite of features are supported for DiscreteBayesNets. Here, we illustrate the following:
 We also detail obtaining a bayesian score for a network structure in the next section.
 
 ```julia
+bn = DiscreteBayesNet()
+push!(bn, DiscreteCPD(:hospital, [:a, :b], [2,2],
+        [Categorical([0.9,0.1]),
+         Categorical([0.2,0.8]),
+         Categorical([0.7,0.3]),
+         Categorical([0.01,0.99]),
+        ]))
+
 count(bn, :a, data) # 1
 statistics(bn.dag, data) # 2
 table(bn, :b) # 3
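One quick way to inspect the newly added CPD, mirroring the `table(bn, :b)` call already in the hunk; this assumes `:a` and `:b` have been pushed onto `bn` beforehand, which the added lines do not show.

```julia
# Hedged usage sketch; assumes :a and :b already exist in bn.
table(bn, :hospital)
```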
@@ -363,12 +371,10 @@ TikzPictures.save(SVG("plot10"), plot) # hide
 The bayesian score for a discrete-valued BayesNet can be calculated based only on the structure and data (the CPDs do not need to be defined beforehand). This is implemented with a method of ```bayesian_score``` that takes in a directed graph, the names of the nodes, and the data.
 
 ```@example bayesnet
-data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
+data = DataFrame(c=[1,1,1,1,2,2,2,2,3,3,3,3],
                  b=[1,1,1,2,2,2,2,1,1,2,1,1],
                  a=[1,1,1,2,1,1,2,1,1,2,1,1])
 g = DAG(3)
 add_edge!(g,1,2); add_edge!(g,2,3); add_edge!(g,1,3)
 bayesian_score(g, [:a,:b,:c], data)
 ```
-
-
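Because `bayesian_score` needs only a graph, the node names, and the data, it is easy to compare candidate structures. A hedged sketch reusing only the calls already shown in this hunk (the alternative edge set is arbitrary):

```julia
# Hedged sketch: score an arbitrary alternative structure over the same data.
g2 = DAG(3)
add_edge!(g2, 1, 2); add_edge!(g2, 1, 3)
bayesian_score(g2, [:a,:b,:c], data)
```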
