
Commit e7acc78

Merge branch 'dev_classification'
# Conflicts:
#	DeepLabStream.py
#	experiments/custom/experiments.py
#	experiments/custom/stimulus_process.py
#	experiments/custom/triggers.py
#	utils/configloader.py
#	utils/poser.py
2 parents 67266d6 + 7571b73

File tree

12 files changed: +3033 -15 lines

DeepLabStream.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -340,7 +340,7 @@ def get_pose_mp(input_q, output_q):
             scmap, locref, ANIMALS_NUMBER, config
         )
         # Use the line below to use raw DLC output rather then DLStream optimization
-        # peaks = pose
+        #peaks = pose
     if MODEL_ORIGIN == "MADLC":
         peaks = get_ma_pose(frame, config, sess, inputs, outputs)
     analysis_time = time.time() - start_time
```

Readme.md

Lines changed: 31 additions & 3 deletions
```diff
@@ -13,7 +13,7 @@ DeepLabStream is a python based multi-purpose tool that enables the realtime tra
 Our toolbox was orginally adapted from the previously published [DeepLabCut](https://github.com/AlexEMG/DeepLabCut) ([Mathis et al., 2018](https://www.nature.com/articles/s41593-018-0209-y)) and expanded on its core capabilities, but is now able to utilize a variety of different network architectures for online pose estimation
 ([SLEAP](https://github.com/murthylab/sleap), [DLC-Live](https://github.com/DeepLabCut/DeepLabCut-live), [DeepPosekit's](https://github.com/jgraving/DeepPoseKit) StackedDenseNet, StackedHourGlass and [LEAP](https://github.com/murthylab/sleap)).
 
-DeepLabStreams core feature is the utilization of real-time tracking to orchestrate closed-loop experiments. This can be achieved using any type of camera-based video stream (incl. multiple streams). It enables running experimental protocols that are dependent on a constant stream of bodypart positions and feedback activation of several input/output devices. It's capabilities range from simple region of interest (ROI) based triggers to headdirection or behavior dependent stimulation.
+DeepLabStreams core feature is the utilization of real-time tracking to orchestrate closed-loop experiments. This can be achieved using any type of camera-based video stream (incl. multiple streams). It enables running experimental protocols that are dependent on a constant stream of bodypart positions and feedback activation of several input/output devices. It's capabilities range from simple region of interest (ROI) based triggers to headdirection or behavior dependent stimulation, including online classification ([SiMBA](https://www.biorxiv.org/content/10.1101/2020.04.19.049452v2), [B-SOID](https://www.biorxiv.org/content/10.1101/770271v2)).
 
 ![DLS_Stim](docs/DLSSTim_example.gif)
 
```
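The rewritten line 16 is the core idea of the toolbox: a continuous stream of body-part coordinates is checked against conditions (ROIs, head direction, classified behavior) and used to drive feedback devices. DLStream's own triggers live in experiments/custom/triggers.py (one of the conflicted files in this merge); the snippet below is only a minimal, self-contained sketch of the ROI-trigger idea with invented names and an assumed skeleton layout, not DLStream's actual Trigger API.

```python
from dataclasses import dataclass


@dataclass
class RegionOfInterest:
    """Axis-aligned rectangle in pixel coordinates (hypothetical helper, not a DLStream class)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, point):
        x, y = point
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def roi_trigger(skeleton, bodypart, roi):
    """Check one frame of pose estimation against an ROI.

    `skeleton` is assumed to map body-part names to (x, y) pixel tuples; that layout
    is an assumption made for this sketch, not DLStream's actual data structure.
    """
    point = skeleton.get(bodypart)
    if point is None:
        return False, {"reason": "bodypart not detected"}
    inside = roi.contains(point)
    return inside, {"bodypart": bodypart, "point": point, "inside": inside}


if __name__ == "__main__":
    roi = RegionOfInterest(100, 100, 300, 250)
    frame_skeleton = {"nose": (150.0, 120.0), "tailroot": (400.0, 300.0)}
    triggered, info = roi_trigger(frame_skeleton, "nose", roi)
    print(triggered, info)  # True: the nose is inside the ROI
```

In a closed-loop experiment, a check like this would run on every incoming frame and, when it fires, hand a stimulation command to an output process.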
```diff
@@ -25,6 +25,14 @@ DeepLabStreams core feature is the utilization of real-time tracking to orchestr
 
 ## New features:
 
+#### 03/2021: Online Behavior Classification using SiMBA and B-SOID:
+
+- full integration of online classification of user-defined behavior using [SiMBA](https://github.com/sgoldenlab/simba) and [B-SOID](https://github.com/YttriLab/B-SOID).
+- SOCIAL CLASSIFICATION with SiMBA 14bp two animal classification (more to come!)
+- Unsupervised Classification with B-SOID
+- New wiki guide and example experiment to get started with online classification: [Advanced Behavior Classification](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Advanced-Behavior-Classification)
+- this version has new requirements (numba, pure, scikit-learn), so be sure to install them (e.g. `pip install -r requirements.txt`).
+
 #### 02/2021: Multiple Animal Experiments (Pre-release): Full [SLEAP](https://github.com/murthylab/sleap) integration (Full release coming soon!)
 
 - Updated [Installation](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Installation-&-Testing) (for SLEAP support)
```
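The 03/2021 bullets describe what "online classification" means in practice: every frame, pose coordinates are turned into a feature vector and passed to a pretrained SiMBA or B-SOiD classifier. The sketch below is a deliberately simplified, hypothetical version of that per-frame loop; the pairwise-distance features, the class labels and the toy random forest stand in for the real SiMBA 14bp / B-SOiD pipelines and are not DLStream code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def extract_features(skeleton):
    """Hypothetical stand-in for SiMBA/B-SOiD feature extraction:
    pairwise distances between all tracked body parts of one frame."""
    points = np.asarray(list(skeleton.values()), dtype=float)   # shape (n_bodyparts, 2)
    diffs = points[:, None, :] - points[None, :, :]
    distances = np.sqrt((diffs ** 2).sum(axis=-1))
    iu = np.triu_indices(len(points), k=1)
    return distances[iu]                                        # flat feature vector


def classify_frame(classifier, skeleton, positive_class=1):
    """Return True if this frame is labelled as the target behavior."""
    features = extract_features(skeleton).reshape(1, -1)
    return classifier.predict(features)[0] == positive_class


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Train a toy classifier on random "feature vectors" so the sketch runs end to end.
    n_bodyparts = 14                                  # the 14bp configuration mentioned above
    n_features = n_bodyparts * (n_bodyparts - 1) // 2
    X = rng.random((200, n_features))
    y = rng.integers(0, 2, 200)
    clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

    skeleton = {f"bp{i}": tuple(rng.random(2) * 400) for i in range(n_bodyparts)}
    print("behavior detected:", classify_frame(clf, skeleton))
```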
```diff
@@ -33,7 +41,8 @@ DeepLabStreams core feature is the utilization of real-time tracking to orchestr
 
 #### 01/2021: DLStream was published in [Communications Biology](https://www.nature.com/articles/s42003-021-01654-9)
 
-#### 12/2021: New pose estimation model integration ([DLC-Live](https://github.com/DeepLabCut/DeepLabCut-live)) and pre-release of further integration ([DeepPosekit's](https://github.com/jgraving/DeepPoseKit) StackedDenseNet, StackedHourGlass and [LEAP](https://github.com/murthylab/sleap))
+#### 12/2021: New pose estimation model integration
+- ([DLC-Live](https://github.com/DeepLabCut/DeepLabCut-live)) and pre-release of further integration ([DeepPosekit's](https://github.com/jgraving/DeepPoseKit) StackedDenseNet, StackedHourGlass and [LEAP](https://github.com/murthylab/sleap))
 
 ## Quick Reference:
 
@@ -131,7 +140,6 @@ If you encounter any issues or errors, you can check out the wiki article ([Help
 
 If you use this code or data please cite:
 
-
 Schweihoff, J.F., Loshakov, M., Pavlova, I. et al. DeepLabStream enables closed-loop behavioral experiments using deep learning-based markerless, real-time posture detection.
 
 Commun Biol 4, 130 (2021). https://doi.org/10.1038/s42003-021-01654-9
@@ -147,3 +155,23 @@ Developed by:
 - Matvey Loshakov, [email protected]
 
 Corresponding Author: Martin Schwarz, [email protected]
+
+## Other References
+
+If you are using any of the following open-source code please cite them accordingly:
+
+> Simple Behavioral Analysis (SimBA) – an open source toolkit for computer classification of complex social behaviors in experimental animals;
+Simon RO Nilsson, Nastacia L. Goodwin, Jia Jie Choong, Sophia Hwang, Hayden R Wright, Zane C Norville, Xiaoyu Tong, Dayu Lin, Brandon S. Bentzley, Neir Eshel, Ryan J McLaughlin, Sam A. Golden
+bioRxiv 2020.04.19.049452; doi: https://doi.org/10.1101/2020.04.19.049452
+
+> B-SOiD: An Open Source Unsupervised Algorithm for Discovery of Spontaneous Behaviors;
+Alexander I. Hsu, Eric A. Yttri
+bioRxiv 770271; doi: https://doi.org/10.1101/770271
+
+> SLEAP: Multi-animal pose tracking;
+Talmo D. Pereira, Nathaniel Tabris, Junyu Li, Shruthi Ravindranath, Eleni S. Papadoyannis, Z. Yan Wang, David M. Turner, Grace McKenzie-Smith, Sarah D. Kocher, Annegret L. Falkner, Joshua W. Shaevitz, Mala Murthy
+bioRxiv 2020.08.31.276246; doi: https://doi.org/10.1101/2020.08.31.276246
+
+> Real-time, low-latency closed-loop feedback using markerless posture tracking;
+Gary A Kane, Gonçalo Lopes, Jonny L Saunders, Alexander Mathis, Mackenzie W Mathis;
+eLife 2020;9:e61909 doi: 10.7554/eLife.61909
```

convert_classifier.py

Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@

```python
import pickle
import os
from pure_sklearn.map import convert_estimator


def load_classifier(path_to_sav):
    """Load saved classifier"""
    file = open(path_to_sav, "rb")
    classifier = pickle.load(file)
    file.close()
    return classifier


def convert_classifier(path):
    # convert to pure python estimator
    print("Loading classifier...")
    clf = load_classifier(path)
    dir_path = os.path.dirname(path)
    filename = os.path.basename(path)
    filename, _ = filename.split(".")
    clf_pure_predict = convert_estimator(clf)
    with open(dir_path + "/" + filename + "_pure.sav", "wb") as f:
        pickle.dump(clf_pure_predict, f)
    print(f"Converted Classifier {filename}")


if __name__ == "__main__":
    path_to_classifier = "PATH_TO_CLASSIFIER"
    convert_classifier(path_to_classifier)
```
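For context on how this script is meant to be used: a scikit-learn classifier pickled as a `.sav` file is converted into a `pure_sklearn` estimator (from the `pure-predict` package) that can predict from plain Python lists without the full scikit-learn stack at prediction time. Below is a hypothetical end-to-end example; the toy random forest and file names are invented, only `convert_classifier` and the `_pure.sav` naming come from the file above.

```python
import pickle

from sklearn.ensemble import RandomForestClassifier

from convert_classifier import convert_classifier

# Train and pickle a small stand-in classifier (in practice this would be a SiMBA/B-SOiD model).
X = [[0.0, 1.0], [1.0, 0.0], [0.5, 0.5], [0.9, 0.1]]
y = [0, 1, 0, 1]
clf = RandomForestClassifier(n_estimators=5, random_state=0).fit(X, y)
with open("./behavior_classifier.sav", "wb") as f:
    pickle.dump(clf, f)

# Writes ./behavior_classifier_pure.sav next to the original file.
convert_classifier("./behavior_classifier.sav")

# The converted estimator predicts from plain Python lists.
with open("./behavior_classifier_pure.sav", "rb") as f:
    clf_pure = pickle.load(f)
print(clf_pure.predict([[0.8, 0.2]]))  # e.g. [1]
```

Note that the script as committed expects exactly one dot in the file name (`filename.split(".")` unpacks two values) and a path with a directory component; a `model.v2.sav`-style name or a bare file name in the current directory would make it fail.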
