Merged

Changes shown from 15 of 31 commits.

Commits
8b4b755  Add OR Node (Sepideh-Adamiat, Jul 14, 2022)
082065c  Add rules for in1 and in2 (Sepideh-Adamiat, Jul 15, 2022)
74ddf64  add And and Implication node (Chengfeng-Jia, Jul 15, 2022)
b7548ec  add Implication node (Chengfeng-Jia, Jul 15, 2022)
37e4706  Update in2 (Sepideh-Adamiat, Jul 15, 2022)
6446033  Add Not node (Sepideh-Adamiat, Jul 15, 2022)
7b01a64  Fix the test for the nodes (Sepideh-Adamiat, Jul 18, 2022)
082a5a8  add test for AND_IMPL (Chengfeng-Jia, Jul 18, 2022)
eec109f  add test for AND_IMPL (Chengfeng-Jia, Jul 18, 2022)
a3ed124  Add marginal rules (albertpod, Jul 19, 2022)
ed56cba  Make format (albertpod, Jul 19, 2022)
fca65ef  add more test exampls (Chengfeng-Jia, Jul 19, 2022)
547df59  Update rules (albertpod, Jul 20, 2022)
b14b302  Merge branches (albertpod, Jul 20, 2022)
03d9233  Make format (albertpod, Jul 20, 2022)
52025ae  Add demo (albertpod, Jul 29, 2022)
b07e976  Merge branch 'master' into dev_logic (albertpod, Jul 29, 2022)
5d58184  Update (albertpod, Jul 29, 2022)
6f68b7e  Fix tests (albertpod, Jul 29, 2022)
b142440  Rename IMPL to IMPLY (albertpod, Aug 1, 2022)
d91613d  test: fix contingency matrix eltype conversion (bvdmitri, Aug 1, 2022)
19449ff  docs: add new notebook to the documentation examples (bvdmitri, Aug 2, 2022)
08d406d  update ordering (bartvanerp, Aug 2, 2022)
6d07229  update notebook output (bvdmitri, Aug 2, 2022)
1ef5b94  Merge branch 'dev_logic' of github.com:biaslab/ReactiveMP.jl into dev… (bvdmitri, Aug 2, 2022)
581dd8b  Update runtests.jl (bartvanerp, Aug 3, 2022)
afa9870  Add descriptions (albertpod, Aug 4, 2022)
8a435ad  Make format (albertpod, Aug 4, 2022)
8ec8000  Fix comments (albertpod, Aug 4, 2022)
4bfdba3  Update docs (albertpod, Aug 5, 2022)
9cda713  Update examples (albertpod, Aug 5, 2022)
4 changes: 4 additions & 0 deletions src/ReactiveMP.jl
@@ -128,11 +128,15 @@ include("nodes/bifm_helper.jl")
include("nodes/probit.jl")
include("nodes/flow/flow.jl")
include("nodes/poisson.jl")
include("nodes/or.jl")
include("nodes/not.jl")

# Deterministic nodes
include("nodes/addition.jl")
include("nodes/subtraction.jl")
include("nodes/multiplication.jl")
include("nodes/and.jl")
include("nodes/implication.jl")

include("rules/prototypes.jl")

5 changes: 5 additions & 0 deletions src/nodes/and.jl
@@ -0,0 +1,5 @@
export AND

struct AND end

@node AND Deterministic [out, in1, in2]
5 changes: 5 additions & 0 deletions src/nodes/implication.jl
@@ -0,0 +1,5 @@
export IMPL

struct IMPL end

@node IMPL Deterministic [out, in1, in2]
Member (bartvanerp):
What is an implication node? As a user I would not know what it does so it either requires: 1) docstrings/documentation or 2) a name change.

Contributor:
I think @bartvanerp's comment is critical and should be addressed. A possible fix would include the truth tables for all logical operations.
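
A sketch of what such a docstring could look like, using the IMPL name as it appears in this diff (illustration only, not part of the PR; it assumes the ReactiveMP `@node` macro is in scope, as in the other node files):

```julia
export IMPL

"""
    IMPL

Deterministic node for logical implication: out = (in1 → in2), i.e. out = ¬in1 ∨ in2,
with Bernoulli-distributed inputs and output.

Truth table:

    in1  in2  out
     0    0    1
     0    1    1
     1    0    0
     1    1    1

Interfaces: [out, in1, in2].
"""
struct IMPL end

@node IMPL Deterministic [out, in1, in2]
```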

Member Author:
Perhaps good to add docstrings for every node. The implication is as "famous" as OR or AND operators.

Member:
Ah I see that is quite common apparently. Could we use the abbreviation IMPLY instead according to https://en.wikipedia.org/wiki/Logical_connective?

Member Author:
@Chengfeng-Jia, what is your opinion on renaming the node?

Member:
I think IMPLY may be a proper name. Besides, if we decide to use & instead of AND, can we use $\rightarrow$ to represent implication?

Member:
For the PointMass part, thanks to @bartvanerp for the explanation. I wonder if a PointMass can be regarded as a special case of the Bernoulli distribution; for example, $x = 1$ would correspond to $\mathcal{B}er(x \mid 1)$.
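
A minimal illustration of that point (my sketch, not part of the PR): treating a point mass at 1 as Bernoulli(1.0) and plugging it into the OR :out expression from src/rules/or/out.jl recovers deterministic behaviour. The helper p_or below is hypothetical and only restates that formula.

```julia
# Success probability the OR node sends over :out, as in src/rules/or/out.jl.
p_or(p_in1, p_in2) = p_in1 + p_in2 - p_in1 * p_in2

p_or(1.0, 0.4)  # 1.0 -> a Bernoulli(1.0) ("surely true") input forces a surely-true output
p_or(0.0, 0.4)  # 0.4 -> a Bernoulli(0.0) ("surely false") input passes the other message through
```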

5 changes: 5 additions & 0 deletions src/nodes/not.jl
@@ -0,0 +1,5 @@
export NOT

struct NOT end

@node NOT Deterministic [out, in1]
Member:
Tests fail because they expect to see :in :)

Suggested change
@node NOT Deterministic [out, in1]
@node NOT Deterministic [out, in]

This change also requires fixing all rules from :in1 to :in.

5 changes: 5 additions & 0 deletions src/nodes/or.jl
@@ -0,0 +1,5 @@
export OR

struct OR end

@node OR Deterministic [out, in1, in2]
8 changes: 8 additions & 0 deletions src/rules/and/in1.jl
@@ -0,0 +1,8 @@
@rule AND(:in1, Marginalisation) (
m_out::Bernoulli,
m_in2::Bernoulli
) = begin
pout, pin2 = mean(m_out), mean(m_in2)

return Bernoulli((1 - pout - pin2 + 2 * pout * pin2) / (2 - 2 * pout - pin2 + 2 * pout * pin2))
end
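
For reference, a sketch of where this expression comes from (my derivation, not part of the diff), assuming $out = in_1 \wedge in_2$ and writing $p_{out}$, $p_{in2}$ for the means of the incoming Bernoulli messages:

$$\mu(in_1 = 1) \propto (1 - p_{out})(1 - p_{in2}) + p_{out}\, p_{in2}, \qquad \mu(in_1 = 0) \propto 1 - p_{out},$$

so after normalisation

$$p(in_1 = 1) = \frac{1 - p_{out} - p_{in2} + 2\, p_{out}\, p_{in2}}{2 - 2\, p_{out} - p_{in2} + 2\, p_{out}\, p_{in2}},$$

which is the Bernoulli parameter returned above; for $p_{out} = 0.6$, $p_{in2} = 0.5$ it evaluates to $0.5/0.9$, matching test/rules/and/test_in1.jl.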
3 changes: 3 additions & 0 deletions src/rules/and/in2.jl
@@ -0,0 +1,3 @@
@rule AND(:in2, Marginalisation) (m_out::Bernoulli, m_in1::Bernoulli, meta::Any) = begin
return @call_rule AND(:in1, Marginalisation) (m_out = m_out, m_in2 = m_in1, meta = meta)
end
8 changes: 8 additions & 0 deletions src/rules/and/marginals.jl
@@ -0,0 +1,8 @@
@marginalrule AND(:in1_in2) (
m_out::Bernoulli,
m_in1::Bernoulli,
m_in2::Bernoulli
) = begin
pin1, pin2, pout = mean(m_in1), mean(m_in2), mean(m_out)
return Contingency([(1-pin1)*(1-pin2)*(1-pout) (1-pin1)*pin2*(1-pout); pin1*(1-pin2)*(1-pout) pin1*pin2*pout])
end
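
A brief note on the Contingency entries (my reading, not part of the diff): up to normalisation, the joint marginal over (in1, in2) is

$$q(in_1 = i_1, in_2 = i_2) \propto p_{in1}^{\,i_1}(1 - p_{in1})^{1 - i_1}\; p_{in2}^{\,i_2}(1 - p_{in2})^{1 - i_2}\; p_{out}^{\,i_1 i_2}(1 - p_{out})^{1 - i_1 i_2},$$

which is exactly the 2×2 table above, with rows indexed by in1 and columns by in2.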
8 changes: 8 additions & 0 deletions src/rules/and/out.jl
@@ -0,0 +1,8 @@
@rule AND(:out, Marginalisation) (
m_in1::Bernoulli,
m_in2::Bernoulli
) = begin
pin1, pin2 = mean(m_in1), mean(m_in2)

return Bernoulli(pin1 * pin2)
end
8 changes: 8 additions & 0 deletions src/rules/implication/in1.jl
@@ -0,0 +1,8 @@
@rule IMPL(:in1, Marginalisation) (
m_out::Bernoulli,
m_in2::Bernoulli
) = begin
pout, pin2 = mean(m_out), mean(m_in2)

return Bernoulli((1 - pout - pin2 + 2 * pout * pin2) / (1 - pin2 + 2 * pout * pin2))
end
8 changes: 8 additions & 0 deletions src/rules/implication/in2.jl
@@ -0,0 +1,8 @@
@rule IMPL(:in2, Marginalisation) (
m_out::Bernoulli,
m_in1::Bernoulli
) = begin
pout, pin1 = mean(m_out), mean(m_in1)

return Bernoulli((pout) / (2 * pout + pin1 - 2 * pout * pin1))
end
8 changes: 8 additions & 0 deletions src/rules/implication/marginals.jl
@@ -0,0 +1,8 @@
@marginalrule IMPL(:in1_in2) (
m_out::Bernoulli,
m_in1::Bernoulli,
m_in2::Bernoulli
) = begin
pin1, pin2, pout = mean(m_in1), mean(m_in2), mean(m_out)
return Contingency([(1-pin1)*pout*(1-pin2) (1-pin1)*pin2*pout; pin1*(1-pin2)*(1-pout) pin1*pin2*pout])
end
8 changes: 8 additions & 0 deletions src/rules/implication/out.jl
@@ -0,0 +1,8 @@
@rule IMPL(:out, Marginalisation) (
m_in1::Bernoulli,
m_in2::Bernoulli
) = begin
pin1, pin2 = mean(m_in1), mean(m_in2)

return Bernoulli(1 - pin1 + pin1 * pin2)
end
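
As a quick sanity check (not part of the diff): with $out = \neg in_1 \lor in_2$ and independent inputs, $P(out = 1) = (1 - p_{in1}) + p_{in1} p_{in2} = 1 - p_{in1} + p_{in1} p_{in2}$, which reduces to $1$ for $p_{in1} = 0$ and to $p_{in2}$ for $p_{in1} = 1$, as material implication requires.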
3 changes: 3 additions & 0 deletions src/rules/not/in1.jl
@@ -0,0 +1,3 @@
export rule

@rule NOT(:in1, Marginalisation) (m_out::Bernoulli,) = Bernoulli(1 - mean(m_out))
7 changes: 7 additions & 0 deletions src/rules/not/marginals.jl
@@ -0,0 +1,7 @@
@marginalrule NOT(:in1) (
m_out::Bernoulli,
m_in1::Bernoulli
) = begin
pin, pout = mean(m_in1), mean(m_out)
return Bernoulli(pin * (1 - pout) / (pin * (1 - pout) + pout * (1 - pin)))
end
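
For reference (my note, not part of the diff): since $out = \neg in$, combining the two messages gives $q(in = 1) \propto p_{in}(1 - p_{out})$ and $q(in = 0) \propto (1 - p_{in})\, p_{out}$, and normalising yields exactly the Bernoulli parameter returned above.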
3 changes: 3 additions & 0 deletions src/rules/not/out.jl
@@ -0,0 +1,3 @@
export rule

@rule NOT(:out, Marginalisation) (m_in1::Bernoulli,) = Bernoulli(1 - mean(m_in1))
7 changes: 7 additions & 0 deletions src/rules/or/in1.jl
@@ -0,0 +1,7 @@
@rule OR(:in1, Marginalisation) (
m_out::Bernoulli,
m_in2::Bernoulli
) = begin
pin2, pout = mean(m_in2), mean(m_out)
return Bernoulli(pout / (1 - pin2 + 2 * pin2 * pout))
end
3 changes: 3 additions & 0 deletions src/rules/or/in2.jl
@@ -0,0 +1,3 @@
@rule OR(:in2, Marginalisation) (m_out::Bernoulli, m_in1::Bernoulli, meta::Any) = begin
return @call_rule OR(:in1, Marginalisation) (m_out = m_out, m_in2 = m_in1, meta = meta)
end
8 changes: 8 additions & 0 deletions src/rules/or/marginals.jl
@@ -0,0 +1,8 @@
@marginalrule OR(:in1_in2) (
m_out::Bernoulli,
m_in1::Bernoulli,
m_in2::Bernoulli
) = begin
pin1, pin2, pout = mean(m_in1), mean(m_in2), mean(m_out)
return Contingency([(1-pin1)*(1-pin2)*(1-pout) (1-pin1)*pin2*pout; pin1*(1-pin2)*pout pin1*pin2*pout])
end
8 changes: 8 additions & 0 deletions src/rules/or/out.jl
@@ -0,0 +1,8 @@
@rule OR(:out, Marginalisation) (
m_in1::Bernoulli,
m_in2::Bernoulli
) = begin
pin1, pin2 = mean(m_in1), mean(m_in2)

return Bernoulli(pin1 + pin2 - pin1 * pin2)
end
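
A one-line check (not part of the diff): $p_{in1} + p_{in2} - p_{in1} p_{in2} = 1 - (1 - p_{in1})(1 - p_{in2})$, i.e. the probability that at least one of two independent inputs is true, as expected for OR.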
19 changes: 19 additions & 0 deletions src/rules/prototypes.jl
@@ -127,3 +127,22 @@ include("bifm_helper/out.jl")
include("poisson/l.jl")
include("poisson/marginals.jl")
include("poisson/out.jl")

include("or/in1.jl")
include("or/in2.jl")
include("or/out.jl")
include("or/marginals.jl")

include("not/in1.jl")
include("not/out.jl")
include("not/marginals.jl")

include("and/in1.jl")
include("and/in2.jl")
include("and/out.jl")
include("and/marginals.jl")

include("implication/in1.jl")
include("implication/in2.jl")
include("implication/out.jl")
include("implication/marginals.jl")
20 changes: 20 additions & 0 deletions test/nodes/test_and.jl
@@ -0,0 +1,20 @@
module AndNodeTest

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "AndNode" begin
@testset "Creation" begin
node = make_node(AND)

@test functionalform(node) === AND
@test sdtype(node) === Deterministic()
@test name.(interfaces(node)) === (:out, :in1, :in2)
@test factorisation(node) === ((1, 2, 3),)
@test localmarginalnames(node) === (:out_in1_in2,)
@test metadata(node) === nothing
end
end
end
20 changes: 20 additions & 0 deletions test/nodes/test_implication.jl
@@ -0,0 +1,20 @@
module OrNodeTest
Member:
Suggested change
module OrNodeTest
module ImplicationNodeTest


using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "ImplicationNode" begin
@testset "Creation" begin
node = make_node(IMPL)

@test functionalform(node) === IMPL
@test sdtype(node) === Deterministic()
@test name.(interfaces(node)) === (:out, :in1, :in2)
@test factorisation(node) === ((1, 2, 3),)
@test localmarginalnames(node) === (:out_in1_in2,)
@test metadata(node) === nothing
end
end
end
19 changes: 19 additions & 0 deletions test/nodes/test_not.jl
@@ -0,0 +1,19 @@
module NotNodeTest

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "NotNode" begin
@testset "Creation" begin
node = make_node(NOT)

@test functionalform(node) === NOT
@test sdtype(node) === Deterministic()
@test name.(interfaces(node)) === (:out, :in)
@test factorisation(node) === ((1, 2),)
@test metadata(node) === nothing
end
end
end
20 changes: 20 additions & 0 deletions test/nodes/test_or.jl
@@ -0,0 +1,20 @@
module OrNodeTest

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "OrNode" begin
@testset "Creation" begin
node = make_node(OR)

@test functionalform(node) === OR
@test sdtype(node) === Deterministic()
@test name.(interfaces(node)) === (:out, :in1, :in2)
@test factorisation(node) === ((1, 2, 3),)
@test localmarginalnames(node) === (:out_in1_in2,)
@test metadata(node) === nothing
end
end
end
22 changes: 22 additions & 0 deletions test/rules/and/test_in1.jl
@@ -0,0 +1,22 @@
module RulesANDIn1Test

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "rules:AND:in1" begin
@testset "Belief Propagation: (m_out::Bernoulli, m_in2::Bernoulli)" begin
@test_rules [with_float_conversions = true] AND(:in1, Marginalisation) [
(
input = (m_out = Bernoulli(0.6), m_in2 = Bernoulli(0.5)),
output = Bernoulli(0.5 / 0.9)
),
(
input = (m_out = Bernoulli(0.3), m_in2 = Bernoulli(0.4)),
output = Bernoulli(0.54 / 1.24)
)
]
end
end
end
22 changes: 22 additions & 0 deletions test/rules/and/test_in2.jl
@@ -0,0 +1,22 @@
module RulesANDIn2Test

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "rules:AND:in2" begin
@testset "Belief Propagation: (m_out::Bernoulli, m_in1::Bernoulli)" begin
@test_rules [with_float_conversions = true] AND(:in2, Marginalisation) [
(
input = (m_out = Bernoulli(0.6), m_in1 = Bernoulli(0.5)),
output = Bernoulli(0.5 / 0.9)
),
(
input = (m_out = Bernoulli(0.3), m_in1 = Bernoulli(0.4)),
output = Bernoulli(0.54 / 1.24)
)
]
end
end
end
32 changes: 32 additions & 0 deletions test/rules/and/test_marginals.jl
@@ -0,0 +1,32 @@
module RulesANDMarginalsTest

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules, @test_marginalrules

@testset "rules:AND:marginals" begin
@testset ":in1_in2 (m_out::Bernoulli, m_in1::Bernoulli, m_in2::Bernoulli)" begin
@test_marginalrules [with_float_conversions = false] AND(:in1_in2) [
(
input = (
m_out = Bernoulli(0.5),
m_in1 = Bernoulli(0.5),
m_in2 = Bernoulli(0.5)
),
output = (Contingency([0.5^3 0.5^3; 0.5^3 0.5^3])
)
),
(
input = (
m_out = Bernoulli(0.2),
m_in1 = Bernoulli(0.8),
m_in2 = Bernoulli(0.4)
),
output = (Contingency([0.2*0.8*0.6 0.2*0.8*0.4; 0.8*0.8*0.6 0.2*0.8*0.4])
)
)
]
end
end
end
22 changes: 22 additions & 0 deletions test/rules/and/test_out.jl
@@ -0,0 +1,22 @@
module RulesANDOutTest

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "rules:AND:out" begin
@testset "Belief Propagation: (m_in1::Bernoulli, m_in2::Bernoulli)" begin
@test_rules [with_float_conversions = true] AND(:out, Marginalisation) [
(
input = (m_in1 = Bernoulli(0.3), m_in2 = Bernoulli(0.5)),
output = Bernoulli(0.15)
),
(
input = (m_in1 = Bernoulli(0.4), m_in2 = Bernoulli(0.3)),
output = Bernoulli(0.12)
)
]
end
end
end
22 changes: 22 additions & 0 deletions test/rules/implication/test_in1.jl
@@ -0,0 +1,22 @@
module RulesImplicationIn1Test

using Test
using ReactiveMP
using Random
import ReactiveMP: @test_rules

@testset "rules:IMPL:in1" begin
@testset "Belief Propagation: (m_out::Bernoulli, m_in2::Bernoulli)" begin
@test_rules [with_float_conversions = true] IMPL(:in1, Marginalisation) [
(
input = (m_out = Bernoulli(0.6), m_in2 = Bernoulli(0.5)),
output = Bernoulli(0.5 / 1.1)
),
(
input = (m_out = Bernoulli(0.2), m_in2 = Bernoulli(0.5)),
output = Bernoulli(0.5 / 0.7)
)
]
end
end
end
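
For reference, the expected outputs above follow directly from the :in1 rule (my check, not part of the diff): with $p_{out} = 0.6$, $p_{in2} = 0.5$ the formula gives $(1 - 0.6 - 0.5 + 0.6)/(1 - 0.5 + 0.6) = 0.5/1.1$, and with $p_{out} = 0.2$, $p_{in2} = 0.5$ it gives $(1 - 0.2 - 0.5 + 0.2)/(1 - 0.5 + 0.2) = 0.5/0.7$.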