Commit 3883712

better explanation of residuals plot
Signed-off-by: Nathaniel <[email protected]>
1 parent 43f4d4e commit 3883712

File tree

2 files changed: +6 -6 lines changed


examples/case_studies/bayesian_sem_workflow.ipynb

Lines changed: 3 additions & 3 deletions
@@ -62,7 +62,7 @@
  " - Each step asks: Does this addition honor theory? Improve fit?\n",
  " - Workflow = constant negotiation between parsimony and fidelity.\n",
  "\n",
- "These approaches complement one another. We'll see how the iterative and expansionary approach to model development is crucial for understanding the subtleties of SEM models. How our understanding grows as we track their implications across increasingly expressive candidate structures."
+ "These approaches complement one another. We'll see how the iterative and expansionary approach to model development is crucial for understanding the subtleties of SEM models. How our understanding grows as we track their implications across increasingly expressive candidate model structures."
  ]
  },
  {
@@ -366,7 +366,7 @@
  "\n",
  "_The Measurement Model_ is the factor-structure we seek to _confirm_ in our analysis. It is called a measurement model because we view the observable metrics as indicators of the thing we actually want to measure. The observable metrics are grouped under a unifying \"factor\" or construct. The idea that each of the indicators are imprecise gauges of the latent factor. The hope is that collectively they provide a better gauge of this hard to measure quantity e.g. satisfaction and well-being. This can be thought of as a data-reduction technique, where we reduce the complex multivariate data set to a smaller collection of inferred features. However, in most SEM applications the factors themselves are of independent interest, not merely a modelling convenience.\n",
  "\n",
- "In factor analysis we posit a factor-structure and estimate how each latent factor determines the observed metrics. The assumed data generating structure says that the factors cause the observed metrics.The inferential task works backwards, we want to infer the shape of the latent factors conditional on the observed metrics.\n",
+ "In factor analysis we posit a factor-structure and estimate how each latent factor determines the observed metrics. The assumed data generating structure says that the factors cause the observed metrics. The inferential task works backwards, we want to infer the shape of the latent factors conditional on the observed metrics.\n",
  "\n",
  "$$ \overbrace{y_i}^{indicators} = \overbrace{\Lambda \eta_i}^{factors} + \varepsilon_i, \n",
  "\quad \varepsilon_i \sim \mathcal N(0, \Psi).\n",
@@ -1571,7 +1571,7 @@
  "\n",
  "Then we summarizes posterior estimates of model parameters (e.g factor loadings, regression coefficients, variances, etc.), providing a quick check against identification constraints (like fixed loadings) and effect directions. \n",
  "\n",
- "Finally we will plot the upper-triangle of the residual correlation matrix with a blue–white–red colormap (−1 to +1). This visualizes residual correlations among observed indicators after the SEM structure is accounted for — helping detect model misfit or unexplained associations."
+ "Finally we will plot the lower-triangle of the residual correlation matrix with a blue–white–red colormap (−1 to +1). This visualizes residuals of the model implied versus true correlations among observed indicators after the SEM structure is accounted for — helping detect model misfit or unexplained associations."
  ]
  },
  {
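[Editor's note] The hunk above rewords the description of the residual-correlation figure. One hedged sketch of what such a plot can look like in code: take the difference between the observed indicator correlations and the correlations implied by the model (approximated here by a posterior-predictive draw `Y_pred`), keep the lower triangle, and draw it with a blue-white-red colormap pinned to the (-1, +1) range. `Y`, `Y_pred`, and the item labels are illustrative placeholders rather than the notebook's actual variables.

```python
import numpy as np
import matplotlib.pyplot as plt


def residual_corr_plot(Y, Y_pred, labels=None):
    """Lower triangle of (observed - model-implied) indicator correlations."""
    resid = np.corrcoef(Y, rowvar=False) - np.corrcoef(Y_pred, rowvar=False)
    # Mask the upper triangle so only the lower triangle (and diagonal) is shown.
    resid = np.where(np.triu(np.ones_like(resid, dtype=bool), k=1), np.nan, resid)

    fig, ax = plt.subplots(figsize=(6, 5))
    im = ax.imshow(resid, cmap="bwr", vmin=-1, vmax=1)
    if labels is not None:
        ax.set_xticks(range(len(labels)))
        ax.set_xticklabels(labels, rotation=90)
        ax.set_yticks(range(len(labels)))
        ax.set_yticklabels(labels)
    fig.colorbar(im, ax=ax, label="observed - implied correlation")
    ax.set_title("Residual correlations after accounting for the SEM structure")
    return ax


# Synthetic stand-ins purely to exercise the function.
rng = np.random.default_rng(1)
Y = rng.normal(size=(200, 6))
Y_pred = Y + rng.normal(scale=0.5, size=Y.shape)
residual_corr_plot(Y, Y_pred, labels=[f"item_{i}" for i in range(6)])
plt.show()
```

Cells far from zero flag indicator pairs whose association the fitted structure fails to reproduce, which is the model-misfit signal the new wording emphasises.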

examples/case_studies/bayesian_sem_workflow.myst.md

Lines changed: 3 additions & 3 deletions
@@ -61,7 +61,7 @@ The structure of the SEM workflow mirrors the Bayesian workflow closely. Each st
  - Each step asks: Does this addition honor theory? Improve fit?
  - Workflow = constant negotiation between parsimony and fidelity.

- These approaches complement one another. We'll see how the iterative and expansionary approach to model development is crucial for understanding the subtleties of SEM models. How our understanding grows as we track their implications across increasingly expressive candidate structures.
+ These approaches complement one another. We'll see how the iterative and expansionary approach to model development is crucial for understanding the subtleties of SEM models. How our understanding grows as we track their implications across increasingly expressive candidate model structures.

  ```{code-cell} ipython3
  import warnings
@@ -204,7 +204,7 @@ In our set up of a Structural Equation Model we have observed variables $y \in R

  _The Measurement Model_ is the factor-structure we seek to _confirm_ in our analysis. It is called a measurement model because we view the observable metrics as indicators of the thing we actually want to measure. The observable metrics are grouped under a unifying "factor" or construct. The idea that each of the indicators are imprecise gauges of the latent factor. The hope is that collectively they provide a better gauge of this hard to measure quantity e.g. satisfaction and well-being. This can be thought of as a data-reduction technique, where we reduce the complex multivariate data set to a smaller collection of inferred features. However, in most SEM applications the factors themselves are of independent interest, not merely a modelling convenience.

- In factor analysis we posit a factor-structure and estimate how each latent factor determines the observed metrics. The assumed data generating structure says that the factors cause the observed metrics.The inferential task works backwards, we want to infer the shape of the latent factors conditional on the observed metrics.
+ In factor analysis we posit a factor-structure and estimate how each latent factor determines the observed metrics. The assumed data generating structure says that the factors cause the observed metrics. The inferential task works backwards, we want to infer the shape of the latent factors conditional on the observed metrics.

  $$ \overbrace{y_i}^{indicators} = \overbrace{\Lambda \eta_i}^{factors} + \varepsilon_i,
  \quad \varepsilon_i \sim \mathcal N(0, \Psi).
@@ -534,7 +534,7 @@ Now for each latent variable (satisfaction, well being, constructive, dysfunctio

  Then we summarizes posterior estimates of model parameters (e.g factor loadings, regression coefficients, variances, etc.), providing a quick check against identification constraints (like fixed loadings) and effect directions.

- Finally we will plot the upper-triangle of the residual correlation matrix with a blue–white–red colormap (−1 to +1). This visualizes residual correlations among observed indicators after the SEM structure is accounted for — helping detect model misfit or unexplained associations.
+ Finally we will plot the lower-triangle of the residual correlation matrix with a blue–white–red colormap (−1 to +1). This visualizes residuals of the model implied versus true correlations among observed indicators after the SEM structure is accounted for — helping detect model misfit or unexplained associations.

  ```{code-cell} ipython3
  :tags: [hide-input]
