Add example for annotating behavioural events with BORIS#865

Open
HollyMorley wants to merge 10 commits into neuroinformatics-unit:main from HollyMorley:add-boris-example

Conversation


@HollyMorley HollyMorley commented Mar 3, 2026

Description

What is this PR

  • Bug fix
  • Addition of a new feature
  • Other

Why is this PR needed?
This PR adds an example demonstrating how one could generate and integrate per-timepoint behavioural event labels alongside pose tracking data in movement. Event labels are a common component in behavioural analyses, whether derived from manual annotation tools such as BORIS (Behavioural Observation Research Interactive Software) or automated classifiers. Gait phase is a good example of such a label, as identifying stance and swing phases on a per-frame and per-limb basis is required for computing typical stride parameters such as stride length, duty factor, and cadence, or for analysing full-body dynamics that are inherently modulated by gait phase.

What does this PR do?
Adds a new example to the gallery using a 3D locomoting mouse dataset (DLC_single-mouse_DBTravelator_3D).

The example is named "Annotate and load events with BORIS" and demonstrates how to:

  1. Annotate gait phase events (stance, swing, unknown) for four limbs of a mouse in BORIS, with step-by-step instructions including screenshots.
  2. Load and process the exported BORIS .csv file and attach the resulting gait phase labels as non-dimension coordinates on the time dimension.
  3. Select and plot data subsets by gait phase using ds.sel() with boolean masks, including combining conditions across multiple limbs and multiple phases simultaneously.
  4. Build on the gait phase labels to assign stride indices to each frame, again as a non-dimension coordinate on the time dimension.
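Steps 2 and 3 above can be sketched in miniature. This is a hedged, self-contained toy, not the example's actual code: the CSV content, column names, `fps`, and the `gait_phase` coordinate name are all illustrative stand-ins for what the real example derives from the BORIS export and the movement dataset.

```python
import csv
import io

import numpy as np
import xarray as xr

# Hypothetical stand-in for a BORIS export, reduced to the columns
# this sketch needs (not the full BORIS schema).
boris_csv = """Behavior,Start (s),Stop (s)
stance,0.00,0.10
swing,0.10,0.20
"""
events = list(csv.DictReader(io.StringIO(boris_csv)))

fps = 100
n_frames = 20
time = np.arange(n_frames) / fps

# Expand each interval event into one label per frame.
labels = np.full(n_frames, "unknown", dtype=object)
for row in events:
    start = int(round(float(row["Start (s)"]) * fps))
    stop = int(round(float(row["Stop (s)"]) * fps))
    labels[start:stop] = row["Behavior"]

# Toy dataset with a "time" dimension, standing in for a movement dataset.
ds = xr.Dataset(
    {"x": ("time", np.linspace(0.0, 1.0, n_frames))},
    coords={"time": time},
)
# Attach the labels as a non-dimension coordinate on "time".
ds = ds.assign_coords(gait_phase=("time", labels))

# Select frames by gait phase with a boolean mask.
stance = ds.sel(time=ds.gait_phase == "stance")
print(stance.sizes["time"])  # number of stance frames
```

Masks built this way can also be combined with `&` and `|` across limbs and phases, which is the pattern step 3 describes.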

Note:

  • The BORIS annotation CSV is currently embedded inline (with only the relevant columns) and written to a temporary file at runtime.
  • The associated 2D dataset (DLC_single-mouse_DBTravelator_2D) is loaded solely to download the video file required for annotation in BORIS. EDIT: a later update changes this to download only the video file using pooch.
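The first bullet's "embedded CSV written to a temporary file at runtime" pattern can be sketched with the standard library alone. The CSV content and file name here are placeholders, not the example's actual annotation data:

```python
import tempfile
from pathlib import Path

# Placeholder for the inline BORIS annotation CSV embedded in the example.
boris_csv = "Behavior,Start (s),Stop (s)\nstance,0.0,0.1\n"

# Write it to a temporary file so downstream code can load it by path,
# exactly as it would load a real BORIS export.
with tempfile.TemporaryDirectory() as tmpdir:
    csv_path = Path(tmpdir) / "boris_events.csv"
    csv_path.write_text(boris_csv)
    header = csv_path.read_text().splitlines()[0]

print(header)  # Behavior,Start (s),Stop (s)
```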

References

Relates to #821, which adds the "Annotate time with events of interest" example demonstrating the non-dimension coordinate pattern that this PR builds on.

How has this PR been tested?

This example has been run locally.

Is this a breaking change?

No.

Does this PR require an update to the documentation?

It is an update to the documentation.

Checklist:

  • The code has been tested locally
  • Tests have been added to cover all new functionality
  • The documentation has been updated to reflect any changes
  • The code has been formatted with pre-commit

codecov bot commented Mar 3, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 100.00%. Comparing base (1b9d294) to head (df4da74).
⚠️ Report is 9 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff            @@
##              main      #865   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           36        38    +2     
  Lines         2205      2284   +79     
=========================================
+ Hits          2205      2284   +79     

@HollyMorley
Contributor Author

Hi @niksirbi and @sfmig.
I did some investigation into the BORIS source code to better understand the allocation of stop frame indices following our discussion. Here is what I have found/confirmed:

  1. Mutually exclusive STOP times are hardcoded as next event time - 0.001 s, independent of video length. See write_event() in write_event.py.
  2. "Fix unpaired events", when run at time T, passes T - 0.001 s to fix_unpaired_state_events(). Each unpaired behaviour is then closed at max(T - 0.001, previous closing events) + 0.001, so for four limbs the STOP events are T, T + 0.001, T + 0.002, and T + 0.003 s. This ensures unique timestamps. See fix_unpaired_events() in state_events.py and fix_unpaired_state_events() in project_functions.py.
  3. "Add frame indexes" seeks the media player to each event timestamp and reads back the current frame index as estimated_frame_number. See add_frame_indexes() in event_operations.py and get_frame_index() in core.py.

I have implemented the Observations > Add frame indexes step in the example, and it works as expected on this dataset. However, I am still uncertain about exactly how frame indices are assigned - this seems to be delegated to the media player, and my understanding is limited beyond that. Given that the automatically generated frame indices matched my original approach of round((df["Stop (s)"] + 0.001) * fps).astype(int), and based on some manual inspection, I would guess it rounds to the nearest frame (for this video at least). Perhaps we need a warning?
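The timestamp arithmetic above can be checked with a tiny sketch. The fps and event time here are illustrative, not taken from the dataset; only the -0.001 s offset and the rounding formula come from the comment:

```python
# A mutually exclusive STOP is written by BORIS as next event time - 0.001 s.
fps = 30.0
next_event_s = 2.0
stop_s = next_event_s - 0.001  # 1.999 s

# The manual estimate: undo the -0.001 s offset, then round to a frame,
# mirroring round((df["Stop (s)"] + 0.001) * fps).
manual_frame = int(round((stop_s + 0.001) * fps))

# Nearest-frame rounding of the raw STOP timestamp agrees here, consistent
# with the observation that both approaches gave identical indices.
nearest_frame = int(round(stop_s * fps))

print(manual_frame, nearest_frame)  # 60 60
```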

Let me know your thoughts! :)

@HollyMorley HollyMorley marked this pull request as ready for review March 10, 2026 16:50
@niksirbi niksirbi requested review from Copilot and niksirbi and removed request for Copilot March 10, 2026 17:49
@niksirbi
Member

Thanks for the example, and for doing the digging into the frame issue @HollyMorley!

I've assigned this PR to myself for review, and I'll come back with comments when I've formed a full picture.

