Commit 55805ae

Reduce paper length and align it to new guidelines
1 parent 3a8bd99 commit 55805ae

File tree: 3 files changed (+23, -43 lines)

joss-paper/paper.md

Lines changed: 19 additions & 43 deletions
    index: 1
  - name: Neuroinformatics Unit, Sainsbury Wellcome Centre & Gatsby Computational Neuroscience Unit, University College London, London W1T 4JG, UK
    index: 2
date: 13 February 2026
bibliography: paper.bib
---

# Summary

In line-scanning microscopy, sample rotation during acquisition introduces geometric distortions that compromise downstream analyses such as cell detection and signal extraction. While several groups have developed custom solutions [@velez-fort_circuit_2018; @hennestad_mapping_2021; @sit_coregistration_2023; @voigts_somatic_2020], a general-purpose tool has been lacking. `derotation` is an open-source Python package that corrects these artifacts by applying a line-by-line inverse rotation using recorded angles and the microscope's line acquisition clock, restoring the expected geometry and enabling reliable cell segmentation during rapid rotational movements (Figure 1).

![Example of `derotation` correction. Left: the average of a series of images acquired using 3-photon microscopy of layer 6 mouse cortical neurons labeled with GCaMP7f during passive rotation. Center: the mean image after derotation. Right: the mean image of the derotated movie after suite2p registration [@pachitariu_suite2p_2016]. Following derotation, the cells are visible and have well-defined shapes.](figure1.png)

# Statement of Need

When sample motion during acquisition is rotational, it produces characteristic "fan-like" distortions that corrupt the morphological features of the imaged structures (Figure 2), significantly complicating or preventing cell segmentation and automated region-of-interest tracking.

This problem is particularly acute in systems neuroscience, where researchers increasingly combine multiphoton calcium imaging with behavioral paradigms involving head rotation [@velez-fort_circuit_2018; @hennestad_mapping_2021; @sit_coregistration_2023; @voigts_somatic_2020]. High-speed angular motion can render imaging data unusable, especially for modalities with lower frame rates such as three-photon imaging. Despite multiple lab-specific solutions, no validated, open-source, and easy-to-use Python tool has been available to the broader community.

![Schematic of line-scanning microscope distortion. Left: the line-scanning pattern combined with sample rotation produces fan-like artifacts when imaging a grid. Right: a grid imaged while still (top), while rotating at 200°/s with a 7 Hz frame rate (middle), and after `derotation` (bottom), showing restored alignment.](figure2.png)

`derotation` meets this need by providing a documented, tested, and modular solution for post hoc correction of imaging data acquired during rotation. The package has been developed openly on GitHub since June 2023, with contributions from five developers. It is distributed on PyPI under a BSD-3-Clause license, with comprehensive documentation, tutorials, and runnable examples via Binder at https://derotation.neuroinformatics.dev. By providing a robust and accessible tool, `derotation` lowers the barrier for complex behavioral experiments and improves the reproducibility of a key analysis step in a growing field of research.

The package has been validated on three-photon recordings of deep cortical neurons expressing the calcium indicator GCaMP7f in head-fixed mice (Figure 3). The corrected images showed restored cellular morphology and were successfully processed by Suite2p [@pachitariu_suite2p_2016]. Compared with frame-by-frame affine correction, line-by-line derotation preserves ROI fluorescence signals during rotation periods, eliminating the artificial dips visible in Figure 3.
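
For reference, $\Delta F/F_0$ denotes the relative fluorescence change of an ROI trace. A minimal sketch of one common formulation is shown below; the percentile-based baseline is an illustrative choice, not necessarily the convention used for Figure 3.

```python
import numpy as np

def delta_f_over_f0(trace, baseline_percentile=10.0):
    """Relative fluorescence change of an ROI trace.

    F0 is taken as a low percentile of the trace. This baseline choice
    is illustrative only, not necessarily the one used in Figure 3.
    """
    f0 = np.percentile(trace, baseline_percentile)
    return (trace - f0) / f0

# A flat trace has zero relative change everywhere.
flat = np.full(100, 5.0)
print(delta_f_over_f0(flat))  # all zeros
```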

![Figure 3. Validation on 3-photon data. Left: mean image after line‑by‑line derotation. Red circle marks the ROI used for the plots on the right. Top right: sample $\Delta F/F_0$ timecourse for the selected ROI (pink = line‑by‑line derotation; gray = frame‑by‑frame affine correction; shaded vertical bars = rotation intervals). Bottom right: mean $\Delta F/F_0$ aligned to rotation periods for clockwise and counterclockwise rotations. Line‑by‑line derotation preserves the ROI signal during rotations and removes the artificial dips introduced by frame‑by‑frame correction. Clockwise and counterclockwise traces show a roughly mirror‑symmetric, angle‑dependent modulation of measured fluorescence with the frame-by-frame correction.](figure3.png)

# State of the Field

Several groups have developed rotation correction procedures as part of their experimental pipelines [@velez-fort_circuit_2018; @hennestad_mapping_2021; @sit_coregistration_2023], but each solution remains embedded in a lab-specific workflow, typically applying frame-level corrections in MATLAB. The closest prior work is @voigts_somatic_2020, who described a line-by-line derotation approach for two-photon imaging during free locomotion, but their code was not designed to generalize across setups with lower frame rates. General-purpose image registration tools such as Suite2p [@pachitariu_suite2p_2016] correct for translational motion but cannot handle within-frame rotational distortions. `derotation` fills this gap by integrating with the scientific Python ecosystem and producing output that can be directly fed into standard registration pipelines.

# Software Design

`derotation` separates the core algorithm from experiment-specific logic through a layered, object-oriented design. The core function (`derotate_an_image_array_line_by_line`) takes a movie and a per-line angle array and returns the corrected stack, with no dependencies on configuration or I/O, so it can be used directly in any custom workflow.
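
The per-line correction can be sketched conceptually as follows. This is a simplified nearest-neighbour illustration of the idea, not the package's actual implementation; the function name and arguments below are invented for this sketch.

```python
import numpy as np

def derotate_line_by_line(movie, angles_deg, center=None):
    """Conceptual sketch: resample each output line from the acquired
    frame at coordinates rotated by that line's recorded angle.

    `movie` has shape (n_frames, height, width); `angles_deg` holds one
    angle per scanned line, i.e. n_frames * height values.
    """
    n_frames, h, w = movie.shape
    cx, cy = center if center is not None else ((w - 1) / 2, (h - 1) / 2)
    out = np.zeros_like(movie)
    xs = np.arange(w)
    for f in range(n_frames):
        for y in range(h):
            theta = np.deg2rad(angles_deg[f * h + y])
            # Rotate the output line's pixel coordinates back into the
            # acquired (rotated) frame around the center of rotation.
            dx, dy = xs - cx, y - cy
            src_x = np.cos(theta) * dx - np.sin(theta) * dy + cx
            src_y = np.sin(theta) * dx + np.cos(theta) * dy + cy
            xi = np.clip(np.round(src_x).astype(int), 0, w - 1)
            yi = np.clip(np.round(src_y).astype(int), 0, h - 1)
            out[f, y] = movie[f, yi, xi]  # nearest-neighbour sampling
    return out

# Zero recorded angles leave the movie untouched:
movie = np.ones((1, 4, 4))
print(np.allclose(derotate_line_by_line(movie, np.zeros(4)), movie))  # True
```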

On top of this core, two pipeline classes (`FullPipeline` and `IncrementalPipeline`) orchestrate the end-to-end processing: data loading, angle interpolation, optional center-of-rotation estimation, derotation, and output saving. Each processing step is implemented as an overridable method, so users can subclass either pipeline to adapt to new experimental setups or data formats. A simulation module generates synthetic test data for systematic regression testing. Full API documentation, configuration guides, and runnable examples are available at https://derotation.neuroinformatics.dev.
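
The overridable-step pattern can be illustrated schematically. The class and method names below are invented for illustration and are not the package's actual API.

```python
class BasePipeline:
    """Minimal stand-in for the pipeline pattern described above:
    each processing step is a method that subclasses may override."""

    def load_data(self):
        # Placeholder data; a real pipeline would read the movie and
        # synchronization signals here.
        return {"angles_per_line": [0.0, 90.0, 180.0, 270.0]}

    def estimate_center(self, data):
        # Default strategy; subclasses can swap in a setup-specific one.
        return (127.5, 127.5)

    def run(self):
        data = self.load_data()
        center = self.estimate_center(data)
        return {"center": center, "n_lines": len(data["angles_per_line"])}


class MyRigPipeline(BasePipeline):
    # Adapt a single step without touching the rest of the workflow.
    def estimate_center(self, data):
        return (130.0, 125.0)


print(MyRigPipeline().run()["center"])  # (130.0, 125.0)
```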

# Research Impact Statement

`derotation` is in use at the Sainsbury Wellcome Centre to process three-photon calcium imaging data acquired during passive head rotation experiments, with its output fed into Suite2p [@pachitariu_suite2p_2016] for cell detection and signal extraction.

# AI Usage Disclosure

GitHub Copilot was used for code autocompletion during development and to assist with drafting portions of this manuscript and the documentation. All AI-generated content was reviewed, tested, and validated by the authors, who carried out the algorithmic design, architectural decisions, and scientific validation.

# Acknowledgements

We thank Eivind Hennestad for initial project discussion. We thank Mateo Vélez-Fort and Chryssanthi Tsitoura for their assistance in building and testing the three-photon imaging and rotation setup, as well as for feedback on the package. We also thank Igor Tatarnikov for contributing to the development of the package, and the whole Neuroinformatics Unit. This package was inspired by previous work on derotation described in [@voigts_somatic_2020]. The authors are grateful to the support staff of the Neurobiological Research Facility at the Sainsbury Wellcome Centre (SWC). This research was funded by the Sainsbury Wellcome Centre core grant from the Gatsby Charitable Foundation (GAT3361) and Wellcome Trust (219627/Z/19/Z), a Wellcome Trust Investigator Award (214333/Z/18/Z) and Discovery Award (306384/Z/23/Z) to T.W.M., and by SWC core funding to the Neurobiological Research Facility. S.W. was funded by a Feodor-Lynen fellowship from the Alexander von Humboldt Foundation.

# References

joss-paper/paper.pdf

-11.9 KB (binary file not shown)