When a given input contrast has a very low fraction of its total variance explained by the decoded contrast, the residual is essentially a noisier version of the input.
In such cases, using the residual contrast for downstream analyses (e.g. queries against the contrast DB) is not useful.
Should there then be a warning, either when decomposing the variance or when performing queries against the contrast DB using the residual contrast?
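One way the proposed warning could work is a simple check on the fraction of variance explained at decomposition time. The sketch below is a minimal illustration, not the project's actual API: the function name `check_residual_usefulness` and the `threshold` cutoff are hypothetical, and it assumes contrasts are plain NumPy arrays.

```python
import warnings
import numpy as np

def check_residual_usefulness(input_contrast, decoded_contrast, threshold=0.1):
    """Warn if the decoded contrast explains too little of the input's variance.

    `threshold` is a hypothetical cutoff on the fraction of variance
    explained; below it, the residual is treated as mostly noise.
    Returns the fraction of variance explained.
    """
    residual = input_contrast - decoded_contrast
    total_var = np.var(input_contrast)
    explained_frac = 1.0 - np.var(residual) / total_var
    if explained_frac < threshold:
        warnings.warn(
            f"Decoded contrast explains only {explained_frac:.1%} of the "
            "input variance; the residual contrast is essentially noise "
            "and may not be useful for downstream queries."
        )
    return explained_frac
```

Emitting the warning once, at decomposition time, and attaching the explained-variance fraction to the residual would also let query code re-check it cheaply without recomputing the decomposition.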