Commit 9070a2e

Merge remote-tracking branch 'origin/master'

2 parents: a780e40 + 595fe33

File tree: 1 file changed (+12, −52 lines)

Readme.md

Lines changed: 12 additions & 52 deletions
```diff
@@ -16,20 +16,25 @@ DeepLabStreams core feature is the real-time analysis using any type of camera-b
 
 ![DLS_Stim](docs/DLSSTim_example.gif)
 
-### Quick Reference:
+## Quick Reference:
+
+### Check out our wiki: [DLStream Wiki](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki)
 
 ### Read the preprint: [Schweihoff et al, 2019](https://doi.org/10.1101/2019.12.20.884478).
 
 ### 1. [Installation & Testing](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Installation-&-Testing)
 
-### 2. [Introduction to experiments](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Introduction)
+### 2. [How to use DLStream GUI](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/How-to-use-DLStream)
+
+### 3. [Introduction to experiments](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Introduction)
 
-### 3. [Design your first experiment](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/My-first-experiment)
+### 4. [Design your first experiment](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/My-first-experiment)
 
-### 4. [Adapting an existing experiment to your own needs](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Adapting-an-existing-experiment-to-your-own-needs)
+### 5. [Adapting an existing experiment to your own needs](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Adapting-an-existing-experiment-to-your-own-needs)
 
+#### Check out our [Out-of-the-Box](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Out-Of-The-Box:-What-Triggers-are-available%3F) section to get a good idea of what DLStream has in stock for your own experiments!
 
-### How does this work
+## How does this work
 
 DeepLabStream uses the camera's video stream to simultaneously record a raw (read as unmodified) video of the ongoing experiment,
 send frames one by one to the neural network for analysis, and use the returned analysed data to plot and show a video stream for the experimenter to observe and control the experiment.
```
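
The "How does this work" paragraph in the hunk above describes a record-analyse-display loop. Below is a minimal sketch of that loop, assuming OpenCV for camera access; `run_stream` and `estimate_pose` are hypothetical names standing in for the DeepLabCut inference call, not DeepLabStream's actual API.

```python
# Minimal sketch of the loop described above, assuming OpenCV for camera
# access. `estimate_pose` is a hypothetical placeholder for the DeepLabCut
# inference call; it is NOT DeepLabStream's actual API.
import cv2

def estimate_pose(frame):
    # Placeholder: a real implementation would run the network on `frame`
    # and return body-part coordinates, e.g. [(x1, y1), (x2, y2), ...].
    return []

def run_stream(camera_index=0, record_path="raw.avi"):
    cap = cv2.VideoCapture(camera_index)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    writer = cv2.VideoWriter(record_path, cv2.VideoWriter_fourcc(*"XVID"), fps, size)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        writer.write(frame)          # 1. record the raw, unmodified frame
        pose = estimate_pose(frame)  # 2. send the frame for per-frame analysis
        for x, y in pose:            # 3. plot the returned body parts
            cv2.circle(frame, (int(x), int(y)), 3, (0, 255, 0), -1)
        cv2.imshow("stream", frame)  # 4. show the annotated live view
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    writer.release()
    cv2.destroyAllWindows()
```

This is only the conceptual skeleton; the actual app wires these steps into the GUI buttons described in the removed Usage section below.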
````diff
@@ -38,53 +43,6 @@ and to end, prolong or modify parts of experimental protocol.
 
 ![Flowchart](docs/flowchart2.png)
 
-## Usage
-
-### How to use DeepLabStream
-
-Just run
-
-```
-cd DeepLabStream
-python app.py
-```
-
-You will see the main control panel of the GUI app.
-
-![Main](docs/screen_gui.png)
-
-To start working with DeepLabStream, press the `Start Stream` button. It will activate the camera manager and show you the current view from the connected cameras.
-
-![Stream](docs/screen_stream.png)
-
-After that you can `Start Analysis` to start DeepLabCut and receive pose estimations for each frame, or, additionally, you can `Start Recording` to record a
-video of the current feed (visible in the stream window). You will see your current video timestamp (counted in frames) and FPS after you press the `Start Analysis` button.
-
-![Analysis](docs/screen_analysis.png)
-
-As you can see, we track three points that represent three body parts of the mouse: nose, neck and tail root.
-Every frame in which the animal was tracked is written to a dataframe, which is saved as a .csv file after the analysis is finished.
-
-After you finish tracking and/or recording the video, you can stop either function by pressing the corresponding stop button
-(`Stop Analysis` or `Stop Recording`), or you can stop the app and reset all timing at once by pressing the `Stop Streaming` button.
-
-#### Experiments
-
-DeepLabStream was built specifically for closed-loop experiments, so with a properly implemented experiment protocol, running experiments on this system is as easy as
-pressing the `Start Experiment` button. Depending on your protocol and experimental goals, experiments can run and finish without any further engagement from the user.
-
-![Start](docs/screen_exp_start.png)
-
-In the provided `ExampleExperiment`, two regions of interest (ROIs) are created inside an arena. The experiment is designed to count the number of times the mouse enters an ROI and trigger a corresponding visual stimulus on a screen.
-The high-contrast stimuli (image files) are located within the `experiments/src` folder and specified within the `experiments.py` `ExampleExperiments` class.
-
-![Experiment](docs/screen_exp.png)
-
-As a visual representation of this event, the border of the ROI will turn green.
-
-All experimental output will be stored in a .csv file for easy postprocessing.
-
-Look at the [Introduction to experiments](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Introduction) to get an idea of how to design your own experiment in DeepLabStream, or learn how to adapt one of the already published experiments at [Adapting an existing experiment](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Adapting-an-existing-experiment-to-your-own-needs).
-
 ### Known issues
 
 If you encounter any issues or errors, you can check out the wiki article ([Help there is an error!](https://github.com/SchwarzNeuroconLab/DeepLabStream/wiki/Help-there-is-an-error!)). If your issue is not listed yet, please refer to the issue tracker and either submit a new issue or find an already reported (and possibly solved) issue there. Thank you!
````
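
The removed Usage text notes that every tracked frame is written to a dataframe and saved as a .csv once analysis finishes. A minimal sketch of that export step, assuming pandas; the body-part names and column layout are illustrative, not DLStream's actual output schema.

```python
# Sketch of the per-frame pose log: one row per tracked frame, saved as a
# .csv when the analysis finishes. Column names are illustrative only.
import pandas as pd

# Stand-in for the stream of per-frame pose estimations.
tracked_frames = [
    {"nose": (105.2, 88.1), "neck": (112.6, 97.4), "tailroot": (140.3, 120.9)},
    {"nose": (106.0, 88.9), "neck": (113.1, 98.0), "tailroot": (140.8, 121.5)},
]

rows = []
for frame_index, pose in enumerate(tracked_frames):
    row = {"frame": frame_index}
    for part, (x, y) in pose.items():
        row[f"{part}_x"], row[f"{part}_y"] = x, y
    rows.append(row)

pd.DataFrame(rows).to_csv("analysis_output.csv", index=False)
```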
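
Likewise, the `ExampleExperiment` described in the removed section counts ROI entries and triggers a stimulus on each entry. A minimal sketch of that trigger logic; `Roi`, `show_stimulus`, and the coordinates are made up for illustration, while the real implementation is the `ExampleExperiments` class in `experiments.py`.

```python
# Sketch of the ExampleExperiment trigger logic: count how often the mouse
# enters each ROI and fire a visual stimulus on entry. All names and
# coordinates are hypothetical; see experiments.py for the real class.
from dataclasses import dataclass

@dataclass
class Roi:
    x: int
    y: int
    w: int
    h: int

    def contains(self, point):
        px, py = point
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def show_stimulus(name):
    # Placeholder: the real experiment displays a high-contrast image
    # from experiments/src on the stimulus screen.
    print(f"stimulus for ROI: {name}")

rois = {"left": Roi(50, 100, 80, 80), "right": Roi(300, 100, 80, 80)}
entry_counts = {name: 0 for name in rois}
was_inside = {name: False for name in rois}

def update(nose_xy):
    """Call once per analysed frame with the tracked nose position."""
    for name, roi in rois.items():
        inside = roi.contains(nose_xy)
        if inside and not was_inside[name]:  # rising edge = one new entry
            entry_counts[name] += 1
            show_stimulus(name)
        was_inside[name] = inside
```

Calling `update(nose_xy)` once per analysed frame would reproduce the entry counting and stimulus behaviour the section describes.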
```diff
@@ -95,7 +53,9 @@ If you use this code or data please cite [Schweihoff et al, 2019](https://doi.or
 
 ## License
 This project is licensed under the GNU General Public License v3.0. Note that the software is provided "as is", without warranty of any kind, expressed or implied.
+
 ## Authors
+
 Lead Researcher: Jens Schweihoff, [email protected]
 
 Lead Developer: Matvey Loshakov, [email protected]
```
