DeepLabStream is a python based multi-purpose tool that enables the realtime tracking of animals and closed-loop stimulation based on user-defined conditions.
Our toolbox was originally adapted from the previously published [DeepLabCut](https://github.com/AlexEMG/DeepLabCut) ([Mathis et al., 2018](https://www.nature.com/articles/s41593-018-0209-y)) and expanded on its core capabilities, but it is now able to utilize a variety of different network architectures for online pose estimation ([SLEAP](https://github.com/murthylab/sleap), [DLC-Live](https://github.com/DeepLabCut/DeepLabCut-live), [DeepPoseKit's](https://github.com/jgraving/DeepPoseKit) StackedDenseNet, StackedHourGlass and [LEAP](https://github.com/murthylab/sleap)).

DeepLabStream's core feature is the utilization of real-time tracking to orchestrate closed-loop experiments. This can be achieved using any type of camera-based video stream (incl. multiple streams). It enables running experimental protocols that depend on a constant stream of bodypart positions and feedback activation of several input/output devices. Its capabilities range from simple region-of-interest (ROI) based triggers to head-direction or behavior-dependent stimulation, including online classification ([SiMBA](https://www.biorxiv.org/content/10.1101/2020.04.19.049452v2), [B-SOID](https://www.biorxiv.org/content/10.1101/770271v2)).

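To make the closed-loop idea concrete, here is a minimal sketch of a region-of-interest trigger that checks each incoming pose estimate and signals when a tracked bodypart enters a target zone. All names (`RectROI`, `roi_trigger`) are invented for illustration and are not part of DeepLabStream's actual API.

```python
# Illustrative sketch only: a rectangular ROI trigger over a stream of (x, y)
# bodypart positions. Names are hypothetical, not DeepLabStream's real classes.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class RectROI:
    """Axis-aligned rectangular region of interest in pixel coordinates."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, point: Tuple[float, float]) -> bool:
        x, y = point
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def roi_trigger(bodypart_xy: Tuple[float, float], roi: RectROI) -> bool:
    """Return True when the tracked bodypart is inside the ROI (stimulation condition)."""
    return roi.contains(bodypart_xy)


if __name__ == "__main__":
    reward_zone = RectROI(100, 100, 300, 300)
    # Stand-in for live pose estimates arriving frame by frame.
    for frame_xy in [(50, 80), (150, 220), (310, 290)]:
        if roi_trigger(frame_xy, reward_zone):
            print(f"{frame_xy}: inside ROI -> activate output device (e.g. TTL pulse)")
        else:
            print(f"{frame_xy}: outside ROI -> no stimulation")
```
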
## New features:

#### 03/2021: Online Behavior Classification using SiMBA and B-SOID:

- Full integration of online classification of user-defined behavior using [SiMBA](https://github.com/sgoldenlab/simba) and [B-SOID](https://github.com/YttriLab/B-SOID) (see the conceptual sketch below).
- Social classification with SiMBA 14bp two-animal classification (more to come!)
- Unsupervised classification with B-SOID
- New wiki guide and example experiment to get started with online classification: [Advanced Behavior Classification](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Advanced-Behavior-Classification)
- This version has new requirements (numba, pure, scikit-learn), so be sure to install them (e.g. `pip install -r requirements.txt`).

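In principle, such an online classification trigger buffers a short window of pose estimates, reduces it to a feature vector, and thresholds the probability returned by a pre-trained classifier. The sketch below uses a generic scikit-learn classifier with made-up features and a dummy pose stream; it does not reproduce SiMBA's or B-SOID's actual feature sets, models, or APIs.

```python
# Conceptual sketch of an online behavior-classification trigger. Window length,
# features, and the classifier are placeholders, not SiMBA's or B-SOID's pipelines.
from collections import deque

import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 15  # number of pose frames pooled into one classification window


def window_features(poses: np.ndarray) -> np.ndarray:
    """Collapse a (WINDOW, n_bodyparts, 2) pose window into a flat feature vector."""
    displacement = np.diff(poses, axis=0)          # frame-to-frame movement
    speed = np.linalg.norm(displacement, axis=-1)  # per-bodypart speed
    return np.concatenate([poses.mean(axis=(0, 1)), speed.mean(axis=0), speed.std(axis=0)])


# Stand-in for a classifier trained offline (e.g. exported from a SiMBA or B-SOID project).
rng = np.random.default_rng(0)
n_bodyparts = 7
X_train = rng.normal(size=(200, 2 + 2 * n_bodyparts))
y_train = rng.integers(0, 2, size=200)  # 0 = "other", 1 = "behavior of interest"
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

buffer: deque = deque(maxlen=WINDOW)
for frame_pose in rng.normal(size=(60, n_bodyparts, 2)):  # stand-in for the live pose stream
    buffer.append(frame_pose)
    if len(buffer) == WINDOW:
        prob = clf.predict_proba(window_features(np.stack(buffer))[None, :])[0, 1]
        if prob > 0.8:  # stimulation threshold
            print(f"behavior detected (p={prob:.2f}) -> trigger output device")
```
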
#### 01/2021: DLStream was published in [Communications Biology](https://www.nature.com/articles/s42003-021-01654-9)

#### 12/2020: New pose estimation model integration

- [DLC-Live](https://github.com/DeepLabCut/DeepLabCut-live) integration, plus a pre-release of further integrations ([DeepPoseKit's](https://github.com/jgraving/DeepPoseKit) StackedDenseNet, StackedHourGlass and [LEAP](https://github.com/murthylab/sleap))

## Quick Reference:

If you encounter any issues or errors, you can check out the Help article in the wiki.

If you use this code or data, please cite:

> Schweihoff, J.F., Loshakov, M., Pavlova, I. et al. DeepLabStream enables closed-loop behavioral experiments using deep learning-based markerless, real-time posture detection. Commun Biol 4, 130 (2021).

If you are using any of the following open-source code, please cite them accordingly:

> Simple Behavioral Analysis (SimBA) – an open source toolkit for computer classification of complex social behaviors in experimental animals;
Simon RO Nilsson, Nastacia L. Goodwin, Jia Jie Choong, Sophia Hwang, Hayden R Wright, Zane C Norville, Xiaoyu Tong, Dayu Lin, Brandon S. Bentzley, Neir Eshel, Ryan J McLaughlin, Sam A. Golden

> SLEAP: Multi-animal pose tracking;
Talmo D. Pereira, Nathaniel Tabris, Junyu Li, Shruthi Ravindranath, Eleni S. Papadoyannis, Z. Yan Wang, David M. Turner, Grace McKenzie-Smith, Sarah D. Kocher, Annegret L. Falkner, Joshua W. Shaevitz, Mala Murthy