I'll summarise here my work for GSoC 2025 and describe a few ideas I had throughout for future improvements.

My main goal (following Mike X Cohen's paper) was to implement an sklearn transformer for generalized eigendecomposition (GED) that would generalize algorithms like CSP, xDAWN, etc. The transformer was implemented in #13259, with support for restriction/whitening of rank-deficient covariances (see the implementation details entry).
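For context, the core operation shared by all of these algorithms is a generalized eigendecomposition of two covariance matrices. Below is a minimal NumPy/SciPy sketch of that idea on toy data (not the actual `_GEDTransformer` code, and without the restriction/whitening step):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n_channels, n_times = 8, 1000

# Two toy conditions, e.g. "task" vs. "rest" for a CSP-like contrast.
X_signal = rng.standard_normal((n_channels, n_times))
X_ref = rng.standard_normal((n_channels, n_times))
S = np.cov(X_signal)  # covariance to maximise
R = np.cov(X_ref)     # reference covariance

# Generalized eigendecomposition S w = lambda R w: the eigenvectors
# (columns) are the spatial filters, sorted here by descending eigenvalue.
evals, evecs = eigh(S, R)
order = np.argsort(evals)[::-1]
evals, filters = evals[order], evecs[:, order]

# Component time courses and the corresponding patterns (forward models).
sources = filters.T @ X_signal
patterns = np.linalg.pinv(filters.T)
```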
GED-based algorithms are spatial filters, i.e. they essentially separate sources. The `SpatialFilter` container for their (and `LinearModel`'s) visualisation was implemented in #13332 and currently supports scree plots and topomaps for filters and patterns (see, for example, the xDAWN example).
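The scree plot is essentially the sorted eigenvalue spectrum of that decomposition. Continuing the toy sketch above (reusing `evals` from it), roughly:

```python
import matplotlib.pyplot as plt

# Sorted generalized eigenvalues from the sketch above: how strongly each
# component expresses the contrast between the two covariance matrices.
plt.plot(range(1, evals.size + 1), evals, "o-")
plt.xlabel("Component")
plt.ylabel("Generalized eigenvalue")
plt.show()
```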
Ideas for future improvements:

- It would be useful to have a tutorial showing how to implement custom covariance estimation / eigenvalue sorting functions for `_GEDTransformer` and how to investigate the resulting sources with `SpatialFilter`.
- After that, `_GEDTransformer` could be made public.
- `_GEDTransformer` could have `inverse_transform` similarly to how `CSP` does, but generalized to the "multi" decomposition as in the `XdawnTransformer` case (the first sketch after this list shows the basic back-projection).
- Following the ICA visualisation of spatially filtered time series, it would be nice to have a similar function for other spatial filters, but this will require adding a new branch for `SpatialFilter` (or unifying it with `ICA`) in `mne.viz._figure.BrowserBase`.
- `SpatialFilter` is intended for visualising multiple spatial filters that are fixed over time, but there are cases, such as EMS, `LinearModel` on vectorized data, or `SlidingEstimator` wrapping `LinearModel` (and potentially `GeneralizingEstimator`), where each time point of an epoch can have a different pattern. These can be conveniently visualised using `EvokedArray` (see the second sketch after this list) and could be implemented either as a second use case for `SpatialFilter` or in another container inheriting from `EvokedArray`, for example.
- `mne.preprocessing.Xdawn` works with `Epochs` and so can't directly inherit from `_GEDTransformer`, but perhaps `_GEDTransformer`'s logic in `fit` and `transform` could be modularised and then reused in `Xdawn`.
- `SlidingEstimator` and `GeneralizingEstimator` currently apply the wrapped classifier per time point. This could be generalized to sliding windows, where the searchlights would pass the windows to the downstream pipeline, covering cases like `SlidingEstimator(make_pipeline(Vectorizer(), SVC()))` or, using pyRiemann transformers, `SlidingEstimator(make_pipeline(XdawnTransformer(), Covariances(), TangentSpace(), SVC()))` for ERP decoding (the third sketch after this list shows the idea as a plain loop).
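On the `inverse_transform` point above: for a spatial decomposition this is just the back-projection of the kept components to sensor space via the patterns. A toy sketch of the math, shapes only (not `CSP`'s actual implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_components, n_times = 8, 3, 100

# Stand-ins for fitted filters/patterns (see the GED sketch further up);
# here random, keeping only n_components < n_channels components.
filters = rng.standard_normal((n_channels, n_components))  # columns = filters
patterns = np.linalg.pinv(filters.T)                       # columns = patterns

X = rng.standard_normal((n_channels, n_times))
sources = filters.T @ X       # transform: sensor space -> component space
X_hat = patterns @ sources    # inverse_transform: back-projection to sensors
```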
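On the per-time-point patterns: this is the kind of recipe shown in MNE's decoding tutorial that such a container would wrap. A sketch assuming an `mne.Epochs` object `epochs` and labels `y` already exist:

```python
import mne
from mne.decoding import SlidingEstimator, LinearModel, get_coef
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# `epochs` (mne.Epochs) and `y` (one label per epoch) are assumed to exist.
X = epochs.get_data()
clf = make_pipeline(StandardScaler(), LinearModel(LogisticRegression()))
time_decod = SlidingEstimator(clf, scoring="roc_auc")
time_decod.fit(X, y)

# One spatial pattern per time point; wrapping them in an EvokedArray lets
# the usual evoked plotting (topomaps, plot_joint) be reused directly.
coef = get_coef(time_decod, "patterns_", inverse_transform=True)
patterns_evoked = mne.EvokedArray(coef, epochs.info, tmin=epochs.times[0])
patterns_evoked.plot_joint(title="Patterns over time")
```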
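And on the window-based searchlight: no new API is proposed here (the exact interface is open), but the loop below shows with plain sklearn on synthetic data what a window-based `SlidingEstimator` would effectively do, i.e. hand each `(n_epochs, n_channels, window)` block to the downstream pipeline:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times, win = 60, 8, 50, 10
X = rng.standard_normal((n_epochs, n_channels, n_times))
y = rng.integers(0, 2, n_epochs)

# For each window, vectorize the (channels x window) block and score the
# downstream pipeline -- one score per window instead of per time point.
scores = []
for start in range(n_times - win + 1):
    X_win = X[:, :, start:start + win].reshape(n_epochs, -1)
    clf = make_pipeline(StandardScaler(), SVC())
    scores.append(cross_val_score(clf, X_win, y, cv=5).mean())
scores = np.array(scores)
```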
The second part of the GSoC was to make the decoding classes more compliant with the new sklearn (1.6+) estimator checks and data validation. `LinearModel` has been made a meta-estimator and reworked in #13361, while sklearn compliance for the other classes was corrected in #13393.
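For reference, the meta-estimator usage this rework targets follows the established pattern below (a minimal sketch on synthetic data; `get_coef` pulls the filters and the corresponding patterns back out of the pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from mne.decoding import LinearModel, get_coef

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))   # e.g. vectorized sensor data
y = rng.integers(0, 2, 100)

# LinearModel wraps a linear classifier and, after fitting, stores the
# spatial patterns corresponding to its filters (coefficients).
clf = make_pipeline(StandardScaler(), LinearModel(LogisticRegression()))
clf.fit(X, y)
filters = get_coef(clf, "filters_", inverse_transform=True)
patterns = get_coef(clf, "patterns_", inverse_transform=True)
```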