Commit 0283d8d
feat: achieve 1:1 scientific rigor parity with main.py across all layers
Inference (MCP + Python API):
- Replace weak MCP confounder detection (adj==1 only) with offline.identify_confounders (1/3/4 confirmed, 2 potential)
- Replace naive MCP auto-select (binary→matching) with offline.select_estimation_method (full decision tree)
- Pass is_linear/treatment_kind to DML/DRL for proper variant selection (LinearDML/SparseLinearDML/CausalForestDML)
- Select MetaLearner variant based on linearity (TLearner for linear, XLearner for nonlinear)
- Add SHAP-benchmarked partial-R2 sensitivity analysis to MCP refute_estimate
- Add treatment_kind and algo to MCP result provenance
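The variant-selection logic described above can be sketched as a small decision function. Only the estimator names (LinearDML, SparseLinearDML, CausalForestDML, TLearner, XLearner) come from this commit; the exact branch conditions inside offline.select_estimation_method are not shown in the message, so the mapping below is a hypothetical illustration, not the real decision tree.

```python
def select_estimator(is_linear: bool, treatment_kind: str) -> str:
    """Toy sketch: map linearity + treatment type to a DML variant name.

    The real offline.select_estimation_method implements a fuller
    decision tree; this branch structure is an assumption.
    """
    if is_linear:
        # Hypothetical split: sparse linear for high-arity treatments.
        return "LinearDML" if treatment_kind == "binary" else "SparseLinearDML"
    # Nonlinear responses fall back to a forest-based learner.
    return "CausalForestDML"


def select_metalearner(is_linear: bool) -> str:
    """MetaLearner variant choice as stated in the commit message."""
    return "TLearner" if is_linear else "XLearner"
```

The MetaLearner rule is stated directly in the commit (TLearner for linear, XLearner for nonlinear); the DML branches are illustrative only.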
Discovery (Python API):
- Replace dropna() with stat_info_collection() (MICE imputation, label encoding, z-score normalization)
- Replace heuristic stat tests with Ramsey RESET (linearity) and Shapiro-Wilk (gaussianity)
- Add LLM Filter+Reranker algorithm selection with rule-based fallback
- Add LLM HyperparameterSelector with defaults fallback
- Add Judge postprocessing (bootstrap stability + KCI pruning + LLM refinement)
- Store original-scale data for estimation (not normalized), matching main.py Analysis class
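The preprocessing step that replaces dropna() has the shape impute-then-normalize. The numpy-only sketch below uses column-mean imputation as a deliberately simpler stand-in for the MICE (iterative) imputation that stat_info_collection performs; the function name `preprocess` is hypothetical. It also shows why the original-scale data must be kept separately: the z-scored copy loses the units needed for effect estimation.

```python
import numpy as np

def preprocess(X: np.ndarray) -> np.ndarray:
    """Impute missing cells, then z-score each column.

    Column-mean imputation is a simplified stand-in for MICE;
    the real pipeline iterates regressions per column.
    """
    X = X.astype(float).copy()
    col_mean = np.nanmean(X, axis=0)
    rows, cols = np.where(np.isnan(X))
    X[rows, cols] = col_mean[cols]          # fill missing cells
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    sigma = np.where(sigma == 0, 1.0, sigma)  # guard constant columns
    return (X - mu) / sigma
```

In the pipeline, discovery runs on the normalized copy while estimation keeps the original-scale matrix, matching the main.py Analysis class.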
Discovery (MCP):
- Remove TS guard on Judge postprocessing — now runs for ALL data including time-series
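The bootstrap-stability part of the Judge postprocessing can be sketched as: rerun discovery on bootstrap resamples and keep only edges that recur in a large fraction of runs. The function names and the correlation-threshold stand-in for the discovery algorithm are hypothetical; only the bootstrap-stability idea comes from the commit (KCI pruning and LLM refinement are separate steps not shown here).

```python
import numpy as np

def corr_edges(X: np.ndarray, thresh: float = 0.5) -> set:
    """Toy discovery: undirected edge when |correlation| >= thresh."""
    C = np.corrcoef(X, rowvar=False)
    d = C.shape[0]
    return {(i, j) for i in range(d) for j in range(i + 1, d)
            if abs(C[i, j]) >= thresh}

def bootstrap_stable_edges(X, discover, n_boot=50, keep_frac=0.8, seed=0):
    """Keep edges recovered in >= keep_frac of bootstrap resamples."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    counts: dict = {}
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)      # resample rows with replacement
        for e in discover(X[idx]):
            counts[e] = counts.get(e, 0) + 1
    return {e for e, c in counts.items() if c / n_boot >= keep_frac}
```

Unstable edges that appear only on a few resamples are pruned, which is what makes the step safe to run on all data, including time-series, once the TS guard is removed.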
All changes fall back gracefully if pipeline modules unavailable.
422 passed, 10 skipped, 0 failed.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

1 parent 5af2d98 · commit 0283d8d
7 files changed (causal_copilot, mcp, tests): +1993 −112 lines