Summary
Long fiber photometry recordings (2-3 hours) exhibit significant photobleaching that affects both the signal channel and the isosbestic control channel. While the isosbestic subtraction step accounts for movement artifacts and much of the shared bleaching, a residual slow drift persists in the corrected (post-subtraction) dF/F signal. The lab requires an additional exponential detrending step applied to the corrected signal after isosbestic subtraction to eliminate this residual trend and produce a stationary baseline.
Motivation
- The Nelson lab routinely runs recordings of approximately 3 hours; slow photobleaching accumulates significantly over this duration
- Rodrigo Paz (Nelson lab) described this explicitly: the isosbestic correction "mainly corrects for movement artifacts... however, both of them have photobleaching over the course of long time, so at the end you end up with something that has an exponential decay, so we also correct for that decay"
- Critically, he noted: "we end up with the corrected signal also has like a slow photobleaching that we also correct" — meaning the residual drift persists even after the standard isosbestic subtraction step
- The motivation for eliminating this drift is analytical: the lab bins recordings into 2-minute segments and correlates per-bin fluorescence with behavioral scores across the session (see Issue 05). Alexandra put it directly: "if nothing was happening, for the signal to be dead flat for three hours, so that we can say, how does the velocity correlate during this two-minute bin here, and this two-minute bin here"
- Without detrending the corrected signal, any bin-to-bin comparison is confounded by the slow residual decay, not genuine biological signal changes
- From the meeting notes: photobleaching has two components — a fast early decay and a slow ongoing drift — suggesting a bi-exponential model may be needed for longer recordings
Proposed Solution
- Add an optional detrending step to Step 4 (`extractTsAndSignal`), applied to the dF/F signal after isosbestic subtraction and before z-scoring
- The detrending fits an exponential (or bi-exponential) model to the post-subtraction dF/F trace and subtracts the fitted trend, leaving only the residual biological signal (see the sketch after this list)
- Expose new parameters in `GuPPyParamtersUsed.json`:
  - `photobleaching_detrend` (bool, default `false`): opt-in toggle
  - `photobleaching_model` (str, `"mono"` or `"bi"`, default `"mono"`): number of exponential components to fit
- Save the fitted trend curve as a diagnostic plot alongside existing preprocessing outputs so users can verify the correction
- Expose the toggle in the Step 4 GUI panel alongside the existing filter window and z-score method controls
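A minimal sketch of what the detrending step could look like, assuming NumPy/SciPy and the parameter names proposed above. The function name `detrend_photobleaching` and its signature are hypothetical illustrations, not GuPPy's existing API:

```python
# Hypothetical sketch of the proposed detrending step; names are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def _mono_exp(t, a, tau, c):
    # Single decay component: a * exp(-t / tau) + c
    return a * np.exp(-t / tau) + c

def _bi_exp(t, a1, tau1, a2, tau2, c):
    # Fast + slow components, matching the two-component bleaching
    # described in the meeting notes
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2) + c

def detrend_photobleaching(t, dff, model="mono"):
    """Fit an exponential trend to the post-subtraction dF/F and subtract it.

    t     : timestamps in seconds (1-D numpy array)
    dff   : corrected dF/F trace (after isosbestic subtraction)
    model : "mono" or "bi", mirroring the proposed photobleaching_model key
    """
    span = dff.max() - dff.min()
    if model == "mono":
        # Initial guess: amplitude ~ signal span, tau ~ a third of the session
        p0 = [span, t[-1] / 3.0, dff.min()]
        popt, _ = curve_fit(_mono_exp, t, dff, p0=p0, maxfev=10000)
        trend = _mono_exp(t, *popt)
    else:
        # Seed the fast component on an early timescale and the slow one
        # on the scale of the whole session
        p0 = [span / 2, t[-1] / 20.0, span / 2, t[-1] / 2.0, dff.min()]
        popt, _ = curve_fit(_bi_exp, t, dff, p0=p0, maxfev=10000)
        trend = _bi_exp(t, *popt)
    # Return the trend too, so it can be plotted or stored as a diagnostic
    return dff - trend, trend
```

Returning the fitted trend alongside the detrended trace keeps the proposed diagnostic plot (and any HDF5 export, per the open question below) straightforward to wire up.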
Open Questions
- Should the detrending be applied to the dF/F signal or to the raw post-subtraction fluorescence before the dF/F normalization step? Applying it to dF/F is conceptually cleaner; applying it earlier may be more numerically stable
- Should the exponential fit be estimated over the entire recording, or per-chunk within the artifact removal windowing? A whole-recording fit is necessary to resolve slow time constants, but the chunking logic in `compute_z_score` would need to be accounted for
- When is a bi-exponential model necessary vs. a simpler mono-exponential? Rodrigo mentioned "a bi-exponential decay or yeah" tentatively, suggesting the fast and slow components may be recording-length-dependent; guidance from testing on real long data would inform the default (see the model-selection sketch after this list)
- Should the detrending be applied only after isosbestic subtraction (the typical path), or also available when running without an isosbestic control channel?
- Should the fitted trend be stored in the HDF5 output for diagnostic inspection, or only saved as a plot?
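One way to settle the mono-vs-bi question empirically would be a per-recording model comparison. This sketch assumes the hypothetical `detrend_photobleaching` above and uses a Gaussian-error AIC to penalize the extra bi-exponential parameters; the criterion is an assumption for illustration, not something from the meeting notes:

```python
# Hypothetical model-selection helper, building on the sketch above.
import numpy as np

def pick_model(t, dff):
    """Return "mono" or "bi", whichever fit has the lower AIC."""
    best = None
    for model, n_params in (("mono", 3), ("bi", 5)):
        _, trend = detrend_photobleaching(t, dff, model=model)
        rss = np.sum((dff - trend) ** 2)
        n = len(dff)
        # Gaussian-error AIC: n * ln(RSS / n) + 2k
        aic = n * np.log(rss / n) + 2 * n_params
        if best is None or aic < best[0]:
            best = (aic, model)
    return best[1]
```

If testing on real long recordings shows the bi-exponential fit winning consistently only past a certain session length, that threshold could inform the default for `photobleaching_model`.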