diff --git a/docs/root/index.html b/docs/root/index.html index 86015297..cee11753 100644 --- a/docs/root/index.html +++ b/docs/root/index.html @@ -7,21 +7,21 @@ - Learnable latent embeddings for joint behavioural and neural analysis - + CEBRA + - + - + @@ -36,7 +36,6 @@ CEBRA @@ -93,58 +116,26 @@
-

Learnable latent embeddings for joint behavioural and neural analysis

-
- - -
-
-
- Steffen Schneider*
- EPFL & IMPRS-IS - - -
-
- Jin Hwa Lee*
- EPFL - -
-
- Mackenzie Mathis
- EPFL - - -
+

CEBRA: a self-supervised learning algorithm for obtaining interpretable, Consistent EmBeddings of high-dimensional Recordings using Auxiliary variables

-
-
-
+ - -
+ - -
- CEBRA is a machine-learning method that can be used to compress time series in a way that reveals otherwise hidden structures in the variability of the data. It excels on behavioural and neural data recorded simultaneously, and it can decode activity from the visual cortex of the mouse brain to reconstruct a viewed video. +
- +
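The compression described above is learned with a contrastive objective. As a minimal sketch of the idea (not the CEBRA implementation itself, just the InfoNCE-style loss family that contrastive embedding methods such as CEBRA build on), the loss for one sample can be written in plain NumPy; all names and numbers below are illustrative.

```python
import numpy as np

def logsumexp(x):
    """Numerically stable log(sum(exp(x)))."""
    m = x.max()
    return m + np.log(np.exp(x - m).sum())

def infonce_loss(ref, pos, negs, temperature=1.0):
    """InfoNCE-style contrastive loss for one reference sample.

    ref:  (d,) reference embedding
    pos:  (d,) embedding of a positive sample (e.g. a temporally or
          behaviourally adjacent time point)
    negs: (n, d) embeddings of negative samples from the rest of the series
    """
    pos_sim = ref @ pos / temperature
    all_sims = np.concatenate(([pos_sim], negs @ ref / temperature))
    # -log( exp(pos_sim) / (exp(pos_sim) + sum_i exp(neg_sim_i)) )
    return logsumexp(all_sims) - pos_sim

rng = np.random.default_rng(0)
ref = np.array([1.0, 0.0])
negs = rng.normal(size=(8, 2))
# a positive sample aligned with the reference scores a lower loss
loss_close = infonce_loss(ref, np.array([0.9, 0.1]), negs)
loss_far = infonce_loss(ref, np.array([-1.0, 0.0]), negs)
```

Minimizing this loss pulls related time points together in the embedding while pushing unrelated ones apart, which is what makes the hidden structure visible.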

Demo Applications

-

Application of CEBRA-Behavior to rat hippocampus data (Grosmark and Buzsáki, 2016), showing position/neural activity (left), overlaid with decoding obtained by CEBRA. The current point in embedding space is highlighted (right). CEBRA obtains a median absolute error of 5cm (total track length: 160cm; see pre-print for details). Video is played at 2x real-time speed.

+

Application of CEBRA-Behavior to rat hippocampus data (Grosmark and Buzsáki, 2016), showing position/neural activity (left), overlaid with decoding obtained by CEBRA. The current point in embedding space is highlighted (right). CEBRA obtains a median absolute error of 5cm (total track length: 160cm; see Schneider et al. 2023 for details). Video is played at 2x real-time speed.
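The decoding step behind a median-absolute-error figure like the one in this caption can be sketched with a plain k-nearest-neighbour average over embeddings. This is a NumPy toy on a synthetic 160 cm "track", not the hippocampus data or the paper's exact decoder; all names are illustrative.

```python
import numpy as np

def knn_decode_position(train_emb, train_pos, test_emb, k=5):
    """Decode position by averaging the positions of the k nearest
    training embeddings (Euclidean distance)."""
    d2 = ((test_emb[:, None, :] - train_emb[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d2, axis=1)[:, :k]
    return train_pos[nearest].mean(axis=1)

# synthetic stand-in: positions on a 160 cm track mapped smoothly and
# monotonically into a noisy 2D "embedding", so embedding neighbours
# share similar positions
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 160.0, size=200)
x = pos / 160.0
emb = np.stack([x, x ** 2], axis=1) + rng.normal(scale=0.02, size=(200, 2))

pred = knn_decode_position(emb[:150], pos[:150], emb[150:])
median_abs_err = np.median(np.abs(pred - pos[150:]))
```

Because the embedding varies smoothly with position, nearby embedding points carry nearby positions, so even this simple decoder recovers position with a small median error.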

+
+ +
+ +
+ +
+ +

Interactive visualization of the CEBRA embedding for the rat hippocampus data. This 3D plot shows how neural activity is mapped to a lower-dimensional space that correlates with the animal's position and movement direction. Open In Colaboratory

+
+
- + -

CEBRA applied to mouse primary visual cortex, collected at the Allen Institute (de Vries et al. 2020, Siegle et al. 2021). 2-photon and Neuropixels recordings are embedded with CEBRA using DINO frame features as labels. The embedding is used to decode the video frames using a kNN decoder on the CEBRA-Behavior embedding from the test set.

+

CEBRA applied to mouse primary visual cortex, collected at the Allen Institute (de Vries et al. 2020, Siegle et al. 2021). 2-photon and Neuropixels recordings are embedded with CEBRA using DINO frame features as labels. The embedding is used to decode the video frames using a kNN decoder on the CEBRA-Behavior embedding from the test set.
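Frame decoding as described in this caption is a classification variant of the kNN idea: look up the nearest training embeddings and take a majority vote over their frame identities. The sketch below uses small synthetic clusters as a stand-in for embeddings labelled with DINO frame features; the data and names are illustrative, not the Allen Institute recordings.

```python
import numpy as np

def knn_decode_frames(train_emb, train_frame_ids, test_emb, k=3):
    """Decode which video frame was shown: majority vote over the
    frame ids of the k nearest training embeddings."""
    d2 = ((test_emb[:, None, :] - train_emb[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d2, axis=1)[:, :k]
    return np.array([np.bincount(train_frame_ids[row]).argmax()
                     for row in nearest])

# toy stand-in: 10 "frames", each a well-separated cluster in a 3D
# embedding space (real DINO features are much higher-dimensional)
rng = np.random.default_rng(1)
centers = rng.normal(size=(10, 3))
train_ids = np.repeat(np.arange(10), 20)
train_emb = centers[train_ids] + rng.normal(scale=0.05, size=(200, 3))
test_ids = np.repeat(np.arange(10), 5)
test_emb = centers[test_ids] + rng.normal(scale=0.05, size=(50, 3))

pred = knn_decode_frames(train_emb, train_ids, test_emb)
accuracy = (pred == test_ids).mean()
```

When the embedding clusters by frame, as a consistent embedding should, this lookup recovers the viewed frame with high accuracy.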

+
+ +
+ + +

CEBRA applied to M1 and S1 neural data, demonstrating how neural activity from primary motor and somatosensory cortices can be effectively embedded and analyzed. See DeWolf et al. 2024 for details.

+
+
+

Publications

+ +
+
+
Learnable latent embeddings for joint behavioural and neural analysis
+

Steffen Schneider*, Jin Hwa Lee*, Mackenzie Weygandt Mathis. Nature 2023

+

A comprehensive introduction to CEBRA, demonstrating its capabilities in joint behavioral and neural analysis across various datasets and species.

+ Read Paper + Preprint +
+
+ +
+
+
Time-series attribution maps with regularized contrastive learning
+

Steffen Schneider, Rodrigo González Laiz, Anastasiia Filippova, Markus Frey, Mackenzie Weygandt Mathis. AISTATS 2025

+

An extension of CEBRA that provides attribution maps for time-series data using regularized contrastive learning.

+ Read Paper + Preprint + NeurIPS-W 2023 Version +
+
+
+ +
+

Patent Information

+ +
+
+
Patent Pending
+

Please note that EPFL has filed a patent titled "Dimensionality reduction of time-series data, and systems and devices that use the resultant embeddings". If this is a concern for your non-academic use case, please contact the Tech Transfer Office at EPFL.

+
+

- Abstract + Overview

@@ -209,31 +257,6 @@

-
-

- - Pre-Print -

-
- -
-

- The pre-print is available on arxiv at arxiv.org/abs/2204.00673. -

- -
-

@@ -244,8 +267,7 @@

You can find our official implementation of the CEBRA algorithm on GitHub: Watch and Star the repository to be notified of future updates and releases.
- You can also follow us on Twitter or subscribe to our mailing list for updates on the project.
+ You can also follow us on Twitter for updates on the project.

If you are interested in collaborations, please contact us via @@ -258,13 +280,13 @@

BibTeX

-

Please cite our paper as follows:

+

Please cite our papers as follows:

@article{schneider2023cebra,
-   author={Schneider, Steffen and Lee, Jin Hwa and Mathis, Mackenzie Weygandt},
+   author={Steffen Schneider and Jin Hwa Lee and Mackenzie Weygandt Mathis},
  title={Learnable latent embeddings for joint behavioural and neural analysis},
  journal={Nature},
  year={2023},
@@ -277,6 +299,58 @@

+ +
+
+ + @inproceedings{schneider2025timeseries,
+   title={Time-series attribution maps with regularized contrastive learning},
+   author={Steffen Schneider and Rodrigo Gonz{\'a}lez Laiz and Anastasiia Filippova and Markus Frey and Mackenzie Weygandt Mathis},
+   booktitle={The 28th International Conference on Artificial Intelligence and Statistics},
+   year={2025},
+   url={https://openreview.net/forum?id=aGrCXoTB4P}
+ } +
+
+
+ +
+

+ + Impact & Citations +

+
+ +
+

+ CEBRA has been cited in numerous high-impact publications across neuroscience, machine learning, and related fields. Our work has influenced research in neural decoding, brain-computer interfaces, computational neuroscience, and machine learning methods for time-series analysis. +

+ + + +
+
+

Our research has been cited in proceedings and journals including Nature, Science, Nature Neuroscience, Neuron, ICML, NeurIPS, ICLR, and others.

+
+
+
+ +
+
+ + MLAI Logo + +
+ © 2021 - present | EPFL Mathis Laboratory +
+
+
Webpage designed using Bootstrap 5 and Font Awesome 5.