README.md: 12 additions & 5 deletions
```diff
@@ -9,7 +9,7 @@
 # movement
 
-A Python toolbox for analysing body movements across space and time, to aid the study of animal behaviour in neuroscience.
+A Python toolbox for analysing animal body movements across space and time.
 
@@ -27,10 +27,17 @@ conda activate movement-env
 ## Overview
 
-Pose estimation tools, such as [DeepLabCut](https://www.mackenziemathislab.org/deeplabcut) and [SLEAP](https://sleap.ai/) are now commonplace when processing video data of animal behaviour. There is not yet a standardised, easy-to-use way to process the pose tracks produced from these software packages.
-
-movement aims to provide a consistent modular interface to analyse pose tracks, allowing steps such as data cleaning, visualisation and motion quantification.
-We aim to support a range of pose estimation packages, along with 2D or 3D tracking of single or multiple individuals.
+Machine learning-based tools such as
+[DeepLabCut](https://www.mackenziemathislab.org/deeplabcut) and
+[SLEAP](https://sleap.ai/) have become commonplace for tracking the
+movements of animals and their body parts in videos.
+However, there is still a need for a standardized, easy-to-use method
+to process the tracks generated by these tools.
+
+`movement` aims to provide a consistent, modular interface for analyzing
+motion tracks, enabling steps such as data cleaning, visualization,
+and motion quantification. We aim to support a range of animal tracking
+frameworks, as well as 2D or 3D tracking of single or multiple individuals.
 
 Find out more on our [mission and scope](https://movement.neuroinformatics.dev/community/mission-scope.html) statement and our [roadmap](https://movement.neuroinformatics.dev/community/roadmaps.html).
```
docs/source/index.md: 9 additions & 4 deletions
```diff
@@ -1,7 +1,7 @@
 (target-movement)=
 # movement
 
-A Python toolbox for analysing body movements across space and time, to aid the study of animal behaviour in neuroscience.
+A Python toolbox for analysing animal body movements across space and time.
 
 ::::{grid} 1 2 2 3
 :gutter: 3
@@ -32,10 +32,15 @@ How to get in touch and contribute.
 ## Overview
 
-Pose estimation tools, such as [DeepLabCut](dlc:) and [SLEAP](sleap:) are now commonplace when processing video data of animal behaviour. There is not yet a standardised, easy-to-use way to process the *pose tracks* produced from these software packages.
+Machine learning-based tools such as [DeepLabCut](dlc:) and [SLEAP](sleap:)
+have become commonplace for tracking the movements of animals and their body
+parts in videos. However, there is still a need for a standardized, easy-to-use method
+to process the tracks generated by these tools.
 
-movement aims to provide a consistent modular interface to analyse pose tracks, allowing steps such as data cleaning, visualisation and motion quantification.
-We aim to support a range of pose estimation packages, along with 2D or 3D tracking of single or multiple individuals.
+``movement`` aims to provide a consistent, modular interface for analyzing
+motion tracks, enabling steps such as data cleaning, visualization,
+and motion quantification. We aim to support a range of animal tracking
+frameworks, as well as 2D or 3D tracking of single or multiple individuals.
 
 Find out more on our [mission and scope](target-mission) statement and our [roadmap](target-roadmaps).
```