Commit cc60832

Added or updated Readmes for all epp examples.
Signed-off-by: Marvin Hansen <[email protected]>
1 parent 7bf30fc commit cc60832

File tree

12 files changed: +190 -4 lines changed

examples/epp_cate/README.md

Lines changed: 45 additions & 0 deletions
# EPP Example: Conditional Average Treatment Effect (CATE)

This crate demonstrates how the `DeepCausality` library, which implements the Effect Propagation Process (EPP), can model and calculate the Conditional Average Treatment Effect (CATE).

Specifically, this example answers the question: **"What is the average effect of a new medication on blood pressure, specifically for the subgroup of patients over the age of 65?"**

This showcases how the EPP's architecture naturally handles concepts from the Rubin Causal Model (RCM), such as potential outcomes, through its powerful contextual reasoning capabilities. This aligns with the principles outlined in Section 5.16 of the EPP documentation.

## How to Run

From within the `examples/epp_cate` directory, run:

```bash
cargo run --bin example-cate
```

---

### How It Works: Mapping CATE to EPP Concepts

The core idea behind CATE is to estimate the average effect of a treatment for a specific subset of a population. The EPP achieves this through the following mechanisms:

1. **Causal Logic (`drug_effect_logic`):**
   The fundamental effect of the drug is encapsulated in a single, reusable `Causaloid`. This causaloid uses a `ContextualCausalFn`, a function that can inspect the context it's evaluated against. Its logic is simple: if it finds a `drug_administered` flag in its context, it returns a numerical effect (e.g., -10.0 for a 10-point drop in blood pressure); otherwise, it returns zero.

2. **Population and Subgroup Selection:**
   The entire patient population is represented as a `Vec<BaseContext>`, where each `Context` is a self-contained representation of an individual, holding their specific attributes (like `age` and `initial_blood_pressure`) as `Datoid`s. We create our subgroup of interest by simply filtering this vector to include only the contexts of patients older than 65.

3. **Potential Outcomes via Contextual Alternation:**
   This is the key step. To calculate the Individual Treatment Effect (ITE) for each person in the subgroup, we must simulate two parallel realities: one where they received the drug, and one where they didn't. The EPP models this cleanly using **Contextual Alternation**:

   * **Treatment Context (`Y(1)`):** For each patient, we clone their original context and *add* a `Datoid` indicating the drug was administered. Evaluating our `Causaloid` against this context yields the *potential outcome under treatment*.
   * **Control Context (`Y(0)`):** We clone the patient's context again, this time adding a `Datoid` indicating the drug was *not* administered. Evaluating the *exact same* `Causaloid` against this second context yields the *potential outcome under control*.

4. **Aggregation to CATE:**
   By subtracting the control outcome from the treatment outcome (`Y(1) - Y(0)`), we get the ITE for each individual. The CATE for the subgroup is then simply the average of all these ITEs.

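To make the potential-outcome bookkeeping in steps 2-4 concrete, here is a minimal plain-Rust sketch. It deliberately avoids the actual `Causaloid`/`Context` API (see the crate's source for the real implementation); the patient values and the -10.0 effect size are illustrative assumptions taken from the description above.

```rust
/// Simplified stand-in for a patient's context: the attributes the example
/// stores as Datoids (age, initial blood pressure).
struct Patient {
    age: u32,
    initial_blood_pressure: f64,
}

/// Stand-in for the causaloid's contextual causal function: it returns the
/// drug effect only if the "drug administered" flag is present.
fn drug_effect(drug_administered: bool) -> f64 {
    if drug_administered { -10.0 } else { 0.0 }
}

fn main() {
    // Toy population; in the example, each patient is a full BaseContext.
    let population = vec![
        Patient { age: 70, initial_blood_pressure: 150.0 },
        Patient { age: 68, initial_blood_pressure: 145.0 },
        Patient { age: 45, initial_blood_pressure: 130.0 }, // not in the subgroup
    ];

    // Subgroup selection: patients over 65.
    let subgroup: Vec<&Patient> = population.iter().filter(|p| p.age > 65).collect();

    // Potential outcomes via contextual alternation: evaluate the same causal
    // law once with and once without the treatment flag, then take Y(1) - Y(0).
    let ites: Vec<f64> = subgroup
        .iter()
        .map(|p| {
            let y1 = p.initial_blood_pressure + drug_effect(true); // Y(1)
            let y0 = p.initial_blood_pressure + drug_effect(false); // Y(0)
            y1 - y0 // Individual Treatment Effect
        })
        .collect();

    // CATE = average ITE over the subgroup.
    let cate = ites.iter().sum::<f64>() / ites.len() as f64;
    println!("CATE for patients over 65: {cate}"); // prints -10
}
```
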
### Conclusion

This example highlights a core strength of the Effect Propagation Process: the explicit separation of **causal logic** (the `Causaloid`) from the **state of the world** (the `Context`). This separation makes it trivial to perform powerful counterfactual reasoning by creating alternate contexts and evaluating the same immutable causal laws against them, providing a robust foundation for advanced causal inference.

## Reference

For more information on the EPP, please see chapter 5 in the EPP document:
https://github.com/deepcausality-rs/papers/blob/main/effect_propagation_process/epp.pdf
File renamed without changes.

examples/epp_csm/README.md

Lines changed: 46 additions & 0 deletions
# EPP Example: Causal State Machine (CSM)

This crate demonstrates how the `DeepCausality` library, which implements the Effect Propagation Process (EPP), can be used to build a Causal State Machine (CSM).

Specifically, this example models a simple industrial monitoring system with three sensors: smoke, fire, and explosion. Each sensor is represented by a `CausalState` that, when its conditions are met, triggers a corresponding `CausalAction` (e.g., raising an alert).

This showcases how the EPP's architecture provides a formal bridge between causal reasoning and deterministic intervention, a concept that aligns with Rung 2 (Intervention) of Pearl's Ladder of Causation.

## How to Run

From within the `examples/epp_csm` directory, run:

```bash
cargo run --bin example-csm
```

---

### How It Works: Mapping CSM Concepts to EPP

The CSM provides a mechanism to link causal inferences to real-world actions. It is a collection of state-action pairs, where each state's activation is determined by a causal model.

1. **Causal Logic as `Causaloid`s:**
   The trigger condition for each sensor is encapsulated in a `Causaloid`. For example, the `smoke_sensor_causaloid` contains a simple `causal_fn` that checks if an incoming numerical value (the sensor reading) exceeds a predefined threshold (e.g., 65.0).

2. **States as `CausalState`s:**
   Each sensor in the system is represented by a `CausalState`. The `CausalState` struct holds a reference to the `Causaloid` that defines its logic. For instance, the `smoke_cs` holds the `smoke_sensor_causaloid`. When the CSM evaluates this state, it uses the causaloid to determine if the state is active.

3. **Actions as `CausalAction`s:**
   Each potential intervention is defined as a `CausalAction`. This struct wraps a function that will be executed when the action is fired. In this example, the actions (`get_smoke_alert_action`, `get_fire_alert_action`, etc.) simply print a message to the console, but they could just as easily trigger an API call, send an email, or control a physical device.

4. **The `CSM` as an Orchestrator:**
   The `CSM` is initialized with a collection of state-action pairs. Its primary role is to orchestrate the evaluation process. The `main` loop simulates a stream of sensor data. In each iteration:
   - A `PropagatingEffect::Numerical` is created from the raw sensor data.
   - `csm.eval_single_state()` is called for each sensor.
   - The CSM finds the corresponding `CausalState`, evaluates its `Causaloid` against the provided data, and if the result is `Deterministic(true)`, it automatically calls the `fire()` method on the associated `CausalAction`.

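The snippet below is a rough, self-contained sketch of that evaluation loop using plain Rust types instead of the actual `CSM`, `CausalState`, and `CausalAction` structs. The 65.0 smoke threshold comes from the description above; the fire threshold, sensor names, and data stream are invented purely for illustration.

```rust
/// Stand-in for a CausalState: a sensor name plus the threshold its causal
/// function checks against.
struct State {
    name: &'static str,
    threshold: f64,
}

impl State {
    /// Stand-in for evaluating the state's Causaloid against a numerical effect.
    fn is_active(&self, reading: f64) -> bool {
        reading > self.threshold
    }
}

/// Stand-in for a CausalAction: a function fired when its paired state activates.
type Action = fn();

fn smoke_alert() { println!("ALERT: smoke detected!"); }
fn fire_alert() { println!("ALERT: fire detected!"); }

fn main() {
    // State-action pairs, analogous to what the CSM is initialized with.
    let csm: Vec<(State, Action)> = vec![
        (State { name: "smoke", threshold: 65.0 }, smoke_alert as Action),
        (State { name: "fire", threshold: 85.0 }, fire_alert as Action), // threshold assumed
    ];

    // Simulated stream of sensor readings: (sensor name, value).
    let readings = [("smoke", 42.0), ("smoke", 79.0), ("fire", 90.0)];

    for (sensor, value) in readings {
        for (state, action) in &csm {
            // Evaluate only the matching state; if it activates, fire the paired action.
            if state.name == sensor && state.is_active(value) {
                action();
            }
        }
    }
}
```
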
### Conclusion

This example demonstrates how the CSM acts as a powerful bridge between the abstract world of causal reasoning and the concrete world of action and intervention. By formally linking `CausalState`s (defined by `Causaloid`s) to `CausalAction`s, the EPP provides a robust, auditable, and deterministic way to build systems that not only understand cause and effect but can also act on that understanding.

## Reference

For more information on the EPP, please see chapter 5 in the EPP document:
https://github.com/deepcausality-rs/papers/blob/main/effect_propagation_process/epp.pdf
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.

examples/epp_dbn/README.md

Lines changed: 48 additions & 0 deletions
# EPP Example: Dynamic Bayesian Network (DBN)

This crate demonstrates how the `DeepCausality` library, which implements the Effect Propagation Process (EPP), can model a simple Dynamic Bayesian Network (DBN).

Specifically, this example models the classic "Umbrella World" scenario, where the decision to take an umbrella today depends on whether it is raining, and the probability of rain today depends on whether it rained yesterday.

This showcases how the EPP's architecture, with its first-class treatment of time and context, provides a natural and powerful framework for modeling temporal causal processes. This aligns with the principles outlined in Section 5.13 of the EPP documentation.

## How to Run

From within the `examples/epp_dbn` directory, run:

```bash
cargo run --bin example-dbn
```

---

### How It Works: Mapping DBN Concepts to EPP

A DBN models a temporal process by "unrolling" a causal graph over discrete time slices. The EPP achieves the same result by evaluating a single, static causal model over a dynamic, temporal context.

1. **Time Slices as a Dynamic Context:**
   Instead of creating new nodes for each time step (e.g., `Rain_t-1`, `Rain_t`), the EPP represents the entire timeline as a single, dynamic `Context`. This context holds `Datoid` nodes representing the state of variables (like `Rain`) at different points in time. As the simulation progresses, this context is updated, representing the forward flow of time.

2. **State Variables as Causaloids:**
   The state variables in the DBN (e.g., `Rain` and `Umbrella`) are represented as `Causaloid`s. Each `Causaloid` encapsulates the conditional probability table (CPT) for that variable as its `causal_fn`.
   - The `rain_causaloid` implements `P(Rain_t | Rain_t-1)`. It queries the context to find the state of rain on the previous day to determine the probability of rain today.
   - The `umbrella_causaloid` implements `P(Umbrella_t | Rain_t)`. It takes the probability of rain today as input and decides whether to take an umbrella.

3. **Dependencies as a CausaloidGraph:**
   The directed edges in the DBN, representing causal dependencies, are modeled as a `CausaloidGraph`. In this case, the graph represents the simple chain: `Rain -> Umbrella`.

4. **Inference as Evaluation over Time (Filtering):**
   The DBN's "filtering" process (updating the belief state as new evidence arrives) is modeled as a loop that simulates the passing of days. In each iteration:
   - The `rain_causaloid` is evaluated to determine the probability of rain for the current day.
   - A random sample is drawn to determine if it actually rained (simulating real-world observation).
   - The `umbrella_causaloid` is evaluated based on the probability of rain.
   - The `Context` is updated with the new state of rain, making it available for the next day's calculation.

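A hedged sketch of that filtering loop is shown below. It uses local variables and a tiny deterministic pseudo-random generator in place of the example's `Causaloid`s, shared `Context`, and real sampling; the transition probabilities are assumptions chosen only to illustrate the `P(Rain_t | Rain_t-1)` dependency.

```rust
/// Tiny linear congruential generator so the sketch needs no external crates;
/// the real example draws proper random samples.
struct Lcg(u64);

impl Lcg {
    /// Returns a pseudo-random value in [0, 1).
    fn next_unit(&mut self) -> f64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
}

fn main() {
    // Assumed CPT for P(Rain_t | Rain_t-1); the real numbers live in the rain_causaloid.
    let p_rain_given_rain = 0.7;
    let p_rain_given_dry = 0.3;

    let mut rng = Lcg(42);
    let mut rained_yesterday = false; // initial state of the dynamic context

    for day in 1..=5 {
        // rain_causaloid: look up yesterday's state in the "context".
        let p_rain_today = if rained_yesterday { p_rain_given_rain } else { p_rain_given_dry };

        // Observation: sample whether it actually rains today.
        let rains_today = rng.next_unit() < p_rain_today;

        // umbrella_causaloid: decide based on the probability of rain.
        let take_umbrella = p_rain_today > 0.5;

        println!("Day {day}: P(rain) = {p_rain_today:.2}, rained = {rains_today}, umbrella = {take_umbrella}");

        // Update the "context" so the next time slice can read today's state.
        rained_yesterday = rains_today;
    }
}
```
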
### Conclusion

This example demonstrates that the EPP provides a flexible alternative to traditional DBNs. By externalizing time into a dynamic `Context`, the EPP allows for the modeling of complex temporal dependencies with a static and reusable causal graph. This separation of concerns simplifies the model and provides a clear and intuitive way to reason about causality in dynamic systems.

## Reference

For more information on the EPP, please see chapter 5 in the EPP document:
https://github.com/deepcausality-rs/papers/blob/main/effect_propagation_process/epp.pdf

examples/epp_granger/README.md

Lines changed: 49 additions & 0 deletions
# EPP Example: Granger Causality

This crate demonstrates how the `DeepCausality` library, which implements the Effect Propagation Process (EPP), can model a Granger Causality test.

Specifically, this example answers the question: **"Do past changes in oil prices Granger-cause future changes in shipping activity?"**

This showcases how the EPP's architecture naturally handles the counterfactual reasoning inherent in a Granger test. This aligns with the principles outlined in Section 5.14 of the EPP documentation.

## How to Run

From within the `examples/epp_granger` directory, run:

```bash
cargo run --bin example-granger
```

---

### How It Works: Mapping Granger Causality to EPP Concepts

The core idea behind Granger Causality is to determine if one time series is useful in forecasting another. The EPP models this by comparing the predictive accuracy of a causal model under two different contexts: one with the complete history (factual) and one where the history of the potential causal variable has been removed (counterfactual).

1. **Causal Logic (`shipping_predictor_logic`):**
   The predictive model is encapsulated in a single, reusable `Causaloid`. This causaloid uses a `ContextualCausalFn`, a function that can inspect the context it's evaluated against. Its logic is to predict the next value of shipping activity based on the historical data it finds in its context. It will use both shipping and oil price data if available, but will gracefully fall back to using only shipping data if oil price data is missing.

2. **Factual vs. Counterfactual Contexts:**
   This is the key to the Granger test. We create two distinct realities:

   * **Factual Context:** A `BaseContext` is created containing the complete, observed history of *both* oil prices and shipping activity.
   * **Counterfactual Context:** A second `BaseContext` is created that contains the history of shipping activity but is *missing* the history of oil prices.

3. **Evaluating Potential Outcomes:**
   We instantiate two separate `Causaloid`s, each with the same predictive logic but associated with a different context:

   * The **factual causaloid** is evaluated against the factual context. Its prediction will be informed by the history of oil prices.
   * The **counterfactual causaloid** is evaluated against the counterfactual context. Its prediction will *not* be informed by the history of oil prices.

4. **Comparing Prediction Errors:**
   We compare the prediction error from both evaluations against a known, actual outcome. If the error from the factual evaluation (which included oil prices) is significantly lower than the error from the counterfactual evaluation, we can conclude that the oil price time series provides valuable information for predicting the shipping activity time series. In other words, oil prices Granger-cause shipping activity.

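The following sketch reduces that comparison to its core: the same predictive logic evaluated once with and once without the oil-price history. The toy series, the predictor's coefficients, and the helper name `predict_next` are all assumptions for illustration; the actual example works through `Causaloid`s and `BaseContext`s.

```rust
/// Stand-in for shipping_predictor_logic: predict the next shipping value from
/// its own history, and from the oil-price history when the context contains it.
fn predict_next(shipping: &[f64], oil_prices: Option<&[f64]>) -> f64 {
    let last_shipping = *shipping.last().expect("non-empty shipping history");
    match oil_prices {
        // Factual path: lagged oil prices inform the forecast (coefficient assumed).
        Some(oil) => last_shipping + 0.5 * *oil.last().expect("non-empty oil history"),
        // Counterfactual path: gracefully fall back to shipping history alone.
        None => last_shipping,
    }
}

fn main() {
    // Toy histories in which shipping activity tracks lagged oil prices.
    let oil_prices = [10.0, 12.0, 15.0, 18.0];
    let shipping = [100.0, 105.0, 111.0, 118.5];
    let actual_next_shipping = 127.4; // known outcome both predictions are scored against

    // Evaluate the same predictive logic against the two contexts.
    let factual = predict_next(&shipping, Some(&oil_prices[..]));
    let counterfactual = predict_next(&shipping, None);

    let err_factual = (factual - actual_next_shipping).abs();
    let err_counterfactual = (counterfactual - actual_next_shipping).abs();

    println!("factual error: {err_factual:.2}, counterfactual error: {err_counterfactual:.2}");

    if err_factual < err_counterfactual {
        println!("Oil prices appear to Granger-cause shipping activity.");
    } else {
        println!("No evidence that oil prices Granger-cause shipping activity.");
    }
}
```
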
### Conclusion

This example highlights a core strength of the Effect Propagation Process: the explicit separation of **causal logic** (the `Causaloid`) from the **state of the world** (the `Context`). This separation makes it trivial to perform the powerful counterfactual reasoning required for a Granger Causality test by simply creating alternate contexts and evaluating the same immutable causal laws against them.

## Reference

For more information on the EPP, please see chapter 5 in the EPP document:
https://github.com/deepcausality-rs/papers/blob/main/effect_propagation_process/epp.pdf
