Commit 5040957

Getting ready for going public!

1 parent 4905e00 commit 5040957
File tree

3 files changed: +450 -2 lines changed

Readme.md

Lines changed: 11 additions & 2 deletions
@@ -5,10 +5,14 @@ DeepLabStream is a python based multi-purpose tool that enables the realtime tra
Our toolbox is adapted from the previously published [DeepLabCut](https://github.com/AlexEMG/DeepLabCut) ([Mathis et al., 2018](https://www.nature.com/articles/s41593-018-0209-y)) and expands on its core capabilities.

DeepLabStream's core feature is real-time analysis using any type of camera-based video stream (incl. multiple streams). Building on that, we designed a full experimental closed-loop toolkit. It enables running experimental protocols that depend on a constant stream of body part positions and feedback activation of several input/output devices. Its capabilities range from simple region of interest (ROI) based triggers to head direction- or behavior-dependent stimulation.
### Quick Reference:
### 1. [Installation](utils/Installation.md)
### 2. [Introduction to experiments](docs/Introduction.md)
### 3. [Design your first experiment](docs/MyFirstExperiment.md)
### How does this work
@@ -102,6 +106,8 @@ As a visual representation of this event, the border of the ROI will turn green.
All experimental output will be stored in a .csv file for easy postprocessing.

Look at the [Introduction to experiments](docs/Introduction.md) to get an idea of how to design your own experiment in DeepLabStream.
### Known issues
#### Error when stopping the analysis:
@@ -125,9 +131,12 @@ The issue can be resolved by closing and opening the app. If not, manually kill
```
killall -9 python
```
## References:
If you use this code or data, please cite [Schweihoff et al., 2019](https://doi.org/10.1101/2019.12.20.884478).
## License
This project is licensed under the GNU Lesser General Public License v3.0. Note that the software is provided "as is", without warranty of any kind, express or implied.
## Authors
Lead Researcher: Jens Schweihoff, [email protected]

docs/Introduction.md

Lines changed: 268 additions & 0 deletions
@@ -0,0 +1,268 @@
# Introduction
This short introductory tutorial is targeted at experimenters with intermediate or advanced Python skills and will not go into the full details of class-based programming or explain the underlying processing functions.
To design and successfully implement an experiment in DeepLabStream, you need:

1. A clear idea of the design and necessary steps of the planned experiment
2. A good understanding of the relationship between the detected behavior and the desired closed-loop event
3. A network and system that can detect the behavior of choice and react within the time frame that your experiment demands
Let's do this step by step. We will take the example experiment included in DeepLabStream as a basis.
## The general structure
An experiment in DeepLabStream is made up of several interacting parts. If you are already familiar with the overall design of DeepLabStream, you can skip this part.
### 1. Triggers:
A `Trigger` is an object created specifically to check whether a certain predefined condition is true in the current frame.
It is checked repeatedly and returns either `True` or `False`, depending on whether the condition was met. For this, the position of a body part or the posture of the animal is evaluated on each frame.
Let's take the `RegionTrigger` as an example:
```
class RegionTrigger:

    def __init__(self, region_type: str, center: tuple, radius: float, bodyparts, debug: bool = False):
        self._roi_type = region_type.lower()
        region_types = {'circle': EllipseROI, 'square': RectangleROI}
        self._region_of_interest = region_types[self._roi_type](center, radius, radius)
        self._bodyparts = bodyparts
```
When creating an experiment, we create an instance of the `RegionTrigger` class with the parameters `region_type`, `center`, `radius` and `bodyparts`.
In this case we are creating one of two different ROI types, depending on `region_type`, with a `center` and a `radius` (or width/length) in pixels.
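For instance, the `ExampleExperiment` we will look at below creates a circular ROI around a point like this (values taken from that example):
```
green_roi = RegionTrigger(region_type='circle', center=(550, 163), radius=40 * 2 + 7.5, bodyparts='neck')
```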
As we will see later, each time DeepLabStream analyses a frame and passes it to the `ExampleExperiment`, the `Trigger` is called via its main function `check_skeleton`.
Let's have a look at a simplified version of this function:
```
def check_skeleton(self, skeleton: dict):
    bp_x, bp_y = skeleton[self._bodyparts]
    result = self._region_of_interest.check_point(bp_x, bp_y)

    color = (0, 255, 0) if result else (0, 0, 255)

    if self._roi_type == 'circle':
        response_body = {'plot': {'circle': dict(center=self._region_of_interest.get_center(),
                                                 radius=int(self._region_of_interest.get_x_radius()),
                                                 color=color)}}

    response = (result, response_body)
    return response
```
Whenever `check_skeleton()` is called, it checks whether the body part (defined by the `bodyparts` parameter) within the skeleton dictionary is inside the ROI. It will return `True` or `False`, plus an additional component that is visualized on the stream.
To simplify this even further, let's assume a different trigger where we want to check whether the animal has crossed the middle of the arena `x_center` (from left to right) and is on the right side of the arena.
It can be easily done with a simple `if` statement.
```
def check_skeleton(self, skeleton: dict):
    bp_x, bp_y = skeleton[self._bodyparts]
    if bp_x > x_center:
        result = True
    else:
        result = False

    return result
```
You can go as complex as you want, taking multiple body parts or even other objects and their relation into account when designing a trigger. Try it yourself! A sketch of such a custom trigger follows below.
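As an illustration, here is a minimal sketch of a custom head direction trigger that returns `True` when the animal's head points towards a target point. It is hypothetical and not part of DeepLabStream: the body part names `'neck'` and `'nose'` and the angle threshold are assumptions.
```
import math

class HeaddirectionTrigger:
    """Hypothetical trigger: True when the neck->nose vector points at a target point."""

    def __init__(self, target: tuple, max_angle: float = 30.0):
        self._target = target        # (x, y) point the head should face
        self._max_angle = max_angle  # allowed deviation in degrees

    def check_skeleton(self, skeleton: dict):
        neck_x, neck_y = skeleton['neck']
        nose_x, nose_y = skeleton['nose']
        # angle of the neck->nose vector and of the neck->target vector
        heading = math.atan2(nose_y - neck_y, nose_x - neck_x)
        to_target = math.atan2(self._target[1] - neck_y, self._target[0] - neck_x)
        # smallest absolute difference between the two angles, in degrees
        diff = abs(math.degrees(heading - to_target)) % 360
        diff = min(diff, 360 - diff)
        return diff < self._max_angle
```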
### 2. Stimulation:
Stimulations are a bit trickier to explain because they heavily depend on your setup and your experiment. Let's go through some basics.
A stimulation is triggered by, and reacts to, a given condition. That's easy; we already know a simple way of doing that: `Trigger`s!
It runs in parallel with the experiment and does not stop or slow down the procedure. That's harder: depending on the stimulation, we might be engaged for a longer time, which would halt the whole process.
Luckily, DeepLabStream was designed to account for such things by using multiprocessing.
We suggest that you leave the general design of experiments in place and adapt your stimulations accordingly, as we designed them to run in parallel to the experiment even if the actual stimulation is a multistep process itself.
We have two important parts here:
`stimulation.py` contains the actual stimulation. `show_visual_stim_img`, for example, creates a window and displays an image in it. In the `ExampleExperiment` this function is used to switch between background and stimulation images on a screen that is visible to the animal from inside the arena.
`toggle_device` is a function that controls a device connected via an NI DAQ board and sends a digital trigger (TTL) signal. It can be used to toggle lasers or any other device that can be connected and modulated through such a board. Most USB-equipped boards come with an API that can be accessed via Python.
The rule of thumb here is: if you can control it with Python, DeepLabStream can control it.
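As a taste of what this looks like, here is a minimal sketch that sends a short TTL pulse on a digital output line using the `nidaqmx` Python package (the device/line name `'Dev1/port0/line0'` and the pulse width are assumptions; DeepLabStream's own `DigitalModDevice` provides similar functionality):
```
import time
import nidaqmx

# open a task on a digital output line and send a short TTL pulse
with nidaqmx.Task() as task:
    task.do_channels.add_do_chan('Dev1/port0/line0')  # hypothetical device/line
    task.write(True)   # TTL high
    time.sleep(0.01)   # 10 ms pulse width
    task.write(False)  # TTL low
```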
`stimulus_process.py` is the protocol that orchestrates the stimulation in another process. It also contains `Timer`, which can be very useful in many cases, as we will see later.
Let's have a look at the main function inside the process, `example_protocol_run`:
```
def example_protocol_run(condition_q: mp.Queue):
    current_trial = None
    dmod_device = DigitalModDevice('Dev1/PFI0')
    while True:
        if condition_q.full():
            current_trial = condition_q.get()
        if current_trial is not None:
            show_visual_stim_img(img_type=current_trial, name='inside')
            dmod_device.toggle()
        else:
            show_visual_stim_img(name='inside')
            dmod_device.turn_off()

        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
```
To simplify the multiprocessing part, assume that we have a connection (queue) between the DeepLabStream app (which analyses and displays the stream) and the experimental protocol (which controls the stimulation). This connection is very simple, and we pass a single argument through it. Whenever we tell the experimental protocol that a stimulation (or trial) should be started, this is passed through the connection.
```
while True:
    if condition_q.full():
        current_trial = condition_q.get()
```
This literally says: Check if there is something waiting for you in the connection (queue) and take it.
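For intuition, the experiment side of such a connection could be wired up roughly like this (a minimal sketch; the single-slot `maxsize=1` queue and the way the process is started are assumptions for illustration, not the exact DeepLabStream code):
```
import multiprocessing as mp

condition_q = mp.Queue(maxsize=1)  # single-slot queue, so full() means "something is waiting"
protocol = mp.Process(target=example_protocol_run, args=(condition_q,))
protocol.start()

# inside the experiment, whenever a trial should be started:
if condition_q.empty():
    condition_q.put('Trial_1')  # pass the trial name to the protocol
```
Continuing with the protocol function itself: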
```
if current_trial is not None:
    show_visual_stim_img(img_type=current_trial, name='inside')
    dmod_device.toggle()
else:
    show_visual_stim_img(name='inside')
    dmod_device.turn_off()
```
Here we are checking whether the parameter taken out of the connection actually means something for our stimulation. In this simple example we just check if `current_trial` is set at all, then pass it to the aforementioned stimulation function while simultaneously activating a device via a digital trigger. This way the image to display is decided inside the stimulation function.
But we can also decide directly inside this function.
For example:
```
if current_trial == 'Trial_1':
    show_visual_stim_img(img_type='Trial_1', name='inside')
elif current_trial == 'Trial_2':
    dmod_device.toggle()
else:
    show_visual_stim_img(name='inside')
    dmod_device.turn_off()
```
Now we are specifically looking for a string in `current_trial`. It will either show an image ('Trial_1') or activate the device ('Trial_2'). If no argument was passed, it will just show a background image and deactivate the device. What we pass to the experimental protocol is decided in the next part.
### 3. The experiment:
Now we come to the scaffold that holds everything together and makes sense of it all.
Let's go through this step by step again:
```
class ExampleExperiment:

    def __init__(self):
        self.experiment_finished = False
        self._process = ExampleProtocolProcess()
        self._green_point = (550, 163)
        self._blue_point = (372, 163)
        self._radius = 40
        self._event = None
        self._current_trial = None
        self._trial_count = {trial: 0 for trial in self._trials}
        self._trial_timers = {trial: Timer(10) for trial in self._trials}
        self._exp_timer = Timer(600)

    @property
    def _trials(self):
        """
        Defining the trials
        """
        green_roi = RegionTrigger('circle', self._green_point, self._radius * 2 + 7.5, 'neck')
        blue_roi = RegionTrigger('circle', self._blue_point, self._radius * 2 + 7.5, 'neck')
        trials = {'Greenbar_whiteback': dict(trigger=green_roi.check_skeleton,
                                             count=0),
                  'Bluebar_whiteback': dict(trigger=blue_roi.check_skeleton,
                                            count=0)}
        return trials
```
The class `ExampleExperiment` is initiated with several parameters, including the actual process that orchestrates stimulation, `ExampleProtocolProcess`. Here you will set most experiment-specific parameters.
To build directly on the last part, we will first have a look at `_trials`. As you can see, we are creating two things here. First, we initiate two different `RegionTrigger`s (have a look at 1. Triggers if this does not tell you anything);
second, we create a dictionary which includes the trigger, a count and a key that refers to each trial. To jump a little bit ahead: the key or name of the trial is what is actually passed to the `ExampleProtocolProcess`, as we saw in 2. Stimulation.
Okay, we have now successfully connected trial, stimulation and trigger, but an experiment is more than that. Now we come to the actual scaffold I was talking about:
The function `check_skeleton` (remember the trigger function with the same name!) is where the magic happens. This function will get every frame analyzed by DeepLabStream and the corresponding posture or "skeleton" of the animal.
Here is a simplified version:
```
def check_skeleton(self, frame, skeleton):
    if not self.experiment_finished:
        result, response = False, None
        for trial in self._trials:
            # check for all trials if the condition is met
            result, response = self._trials[trial]['trigger'](skeleton=skeleton)
            if result:
                if self._current_trial is None:
                    self._current_trial = trial
                    self._trial_count[trial] += 1
                    print(trial, self._trial_count[trial])
            else:
                if self._current_trial == trial:
                    self._current_trial = None

        self._process.set_trial(self._current_trial)
```
When the experiment is not finished, check for both trials (in this case the term "trial" is a bit confusing, as in this experiment it is really just connected to a condition) whether the condition/trigger is met.
If it was met, increase the trial counter by 1 and pass the trial name to the stimulation.
This simplified example would show images and trigger a device indefinitely as long as the animal is entering the defined ROIs. We are missing a crucial component that is part of any experiment: `Timer`!
### 4. Timer:
You probably spotted them already. We actually mentioned them earlier. But let's first look at the basic function and then their implementation.
A timer should be able to track time independently of the actual processing speed that the rest of the software is limited by.
So it's quite simple: when we create an instance of the `Timer` class, we specify the time in seconds it should keep track of. Every time we check the `Timer`, it tells us whether that time has run out or not. The rest of the functionality of this class is mainly utility.
For example, we can reset a `Timer` to start it again, without the need to create it anew.
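To make this concrete, here is a minimal sketch of the idea behind such a timer. The method names (`start`, `check_timer`, `reset`) are assumptions for illustration; the actual class in DeepLabStream may differ.
```
import time

class Timer:
    """Minimal sketch: tracks whether a given number of seconds has run out."""

    def __init__(self, seconds: float):
        self._seconds = seconds
        self._start = None

    def start(self):
        self._start = time.time()

    def check_timer(self):
        # True while the timer is running and its time has not run out yet
        return self._start is not None and (time.time() - self._start) < self._seconds

    def reset(self):
        self._start = None
```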
Let's talk about implementations. At this point it's not very useful to go through the code inside the experiment, as all timers follow the same simple principle. If you have a look into the actual `ExampleExperiment`, you will see several use cases.
a. Total experimental time, aka `exp_timer`. If this timer runs out, your total experimental time was reached and the experiment should stop itself. In almost all experiments this is a must: this timer keeps track of time for you and ends the experiment automatically in a coordinated fashion. This does not get the animal out of the arena though... sorry.
b. Inter-trial time, aka inter-stimulation time. Here is where the reset comes in handy. When we want a pause between each triggered stimulation or trial, just add one of these and reset it after each trial/event. Don't forget to start it again though!
c. Stimulation time. Assuming that you want to stimulate your animal not only when the condition/trigger is met, but also for some time after, this timer is useful to turn your stimulus off again once that time has passed. Depending on the experiment it might be used in the `Experiment` class, but most likely you will implement it in `stimulus_process.py`; a sketch follows below.
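As a sketch of case c, assuming the hypothetical `Timer` from above, a fragment inside the protocol loop could look like this (the 5-second window and the `stimulated` flag are illustrative assumptions):
```
stim_timer = Timer(5)  # keep the stimulus on for 5 seconds
stimulated = False

# inside the protocol loop:
if current_trial is not None and not stimulated:
    dmod_device.toggle()  # turn the stimulus on
    stim_timer.start()    # open the 5 s stimulation window
    stimulated = True
if stimulated and not stim_timer.check_timer():
    dmod_device.turn_off()  # stimulation time has passed
    stim_timer.reset()
    stimulated = False
```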
There are several other possible uses for the `Timer`, but this covers the basics that will most likely appear in any experiment you design. As they say: `Timer` is of the essence!
## Design your own experiment
With the basics in place, you should be ready to start adapting the `ExampleExperiment` to fit your experimental needs. Rather than rewriting the entire code, we recommend taking the example and modifying the predefined structure. If you want to design your own triggers, we recommend looking at your previous post hoc data analysis.
If you have worked with DeepLabCut in the past, you most likely have an idea of how to find out whether certain conditions were met by the animal during the offline experiment. Can you reduce it to a frame-by-frame `True` or `False` output?
Yes? Congratulations, you got your first `Trigger` candidate!
Look at the [MyFirstExperiment tutorial](MyFirstExperiment.md) to get an idea of how to design your own experiment in DeepLabStream.
### Testing experiments offline
To test your design, we recommend using `VideoAnalyzer.py`, an offline testing script implemented within `/utils` that enables experiment tests using prerecorded videos. It will give you an idea of the feasibility of your design.
As an example: You can run the `ExampleExperiment` on any video by simply inserting the path of the video into `settings.ini`:
```
[Video]
VIDEO_SOURCE = FullVideoPath.avi
```
Then run `VideoAnalyzer.py` the same way you would run `app.py` (note: `VideoAnalyzer.py` does not offer a GUI).
To test your own experiments, you have to import your custom `ExperimentClass` like this:
```
# add it to the import line
from experiments.experiments import ExampleExperiment, YourExperiment
```
and change the following line to create an instance of that experiment:
```
# old line:
experiment = ExampleExperiment()
# new line:
experiment = YourExperiment()
```
If you want to run the DeepLabStream posture detection on a prerecorded video without running an experiment, just set `experiment_enabled` to `False`.
A video of your offline test will be saved if `video_output` is set to `True` (default = `True`). As usual, all experimental data will be exported to a .csv file.
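In `settings.ini` this could look like the following (the section name here is an assumption; check your own `settings.ini` for where these flags live):
```
[Experiment]
EXPERIMENT_ENABLED = False
VIDEO_OUTPUT = True
```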
Note: `VideoAnalyzer.py` is not built to analyze videos quickly; it is specifically built to show the result immediately as a "live" stream.
## Concluding remarks
We did not cover all functions within the classes and parts we discussed, but most of them are commented extensively. Have a look at the script. Now that you know the basic principle, it should be much easier to understand.
