As we have discussed earlier [10], applying the ‘artificial light texture’ to the person’s body and/or to the objects surrounding the person leads to a significant (potentially orders-of-magnitude) enhancement of the variations in the “Manhattan distance” between consecutive video frames that are associated with the person’s respiration and/or heartbeats, compared to the case when the ‘artificial light texture’ is absent (e.g., when the ALT-generating light emitter is switched off) and otherwise identical data collection and data processing steps are performed.
Provided that video frames are captured at equal time intervals, the calculated SAD values can be viewed as the integral (summed) rate of change of the numeric values that are associated with the video frame pixels and used in the SAD calculations.
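The SAD computation itself can be sketched as follows. This is an illustrative Python sketch (the snippets in this README are C#; NumPy and the function names here are our assumptions, not part of the repository code):

```python
import numpy as np

def sad(frame_a, frame_b):
    # Sum of absolute differences between two equally sized frames.
    # A wide integer type avoids overflow when subtracting pixel values.
    a = np.asarray(frame_a, dtype=np.int64)
    b = np.asarray(frame_b, dtype=np.int64)
    return int(np.abs(a - b).sum())

def sad_series(frames):
    # One SAD value per pair of consecutive frames; with frames captured
    # at equal time intervals, this acts as the summed rate of change of
    # the pixel values, as described above.
    return [sad(frames[i], frames[i + 1]) for i in range(len(frames) - 1)]
```

For example, for two 2×2 frames differing by 1 in one pixel and by 2 in another, `sad` returns 3.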
The code snippets below illustrate how SAD values can be calculated for the infrared (IR) channel frames captured by R200 and F200 devices according to the implementation description above.
R200:
```C#
…
```

F200:

```C#
…
```

The raw ALT data are shown in the figures below by a black line connecting the data points (the calculated SAD values for the captured video frames, shown by black dots). The orange and blue lines are 0.2 s and 1 s long (in the equivalent number of data samples) moving averages of the raw data, shown to highlight the heartbeat and respiration processes captured in the ALT data, respectively. Numeric values for the heart rate and respiration rate can be obtained, for example, via Fourier data analysis [14].
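The moving-average smoothing and the Fourier estimation of a rate can be sketched as follows. This is an illustrative Python sketch (NumPy-based; the function names and frequency band limits are our assumptions), not the code used to produce the figures:

```python
import numpy as np

def moving_average(x, window):
    # Simple moving average; `window` is the length in samples
    # (e.g., 0.2 s at 30 frames per second -> window = 6).
    return np.convolve(x, np.ones(window) / window, mode="valid")

def dominant_frequency(x, fps, fmin, fmax):
    # Frequency (Hz) of the strongest spectral component of `x`
    # within the band [fmin, fmax].
    x = np.asarray(x, dtype=float)
    x = x - x.mean()  # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= fmin) & (freqs <= fmax)
    return float(freqs[band][np.argmax(spectrum[band])])

# Example: a 1.2 Hz component (72 beats per minute) in 20 s of
# synthetic "SAD" data sampled at 30 frames per second.
fps = 30
t = np.arange(0, 20, 1.0 / fps)
signal = np.sin(2 * np.pi * 1.2 * t)
rate_hz = dominant_frequency(signal, fps, fmin=0.7, fmax=3.0)  # ~1.2 Hz
```

The band limits restrict the search to physiologically plausible frequencies (here roughly 42–180 beats per minute for a heartbeat); a lower band would be used for respiration.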
Note that, similarly to the “Pi Camera + Kinect” ALT system [10], the distance between the F200 or R200 camera and the person can affect how pronounced the heartbeat signal is during respiration events. Generally, the closer the camera gets to the person (past a certain distance), the less pronounced the heartbeat component of the ALT data becomes during respiration events. Note also that at a large enough distance between the camera and the person there will be virtually no discernible pulse or respiration signal in the ALT data. Adjustments of the camera’s position can be made, for example, based on observing visualizations of the collected ALT data.
Snapshots of the scene captured by the cameras are shown to the left of the corresponding data plots.

Figure 2. ALT data obtained using light emitter and IR video stream of a F200 Intel RealSense device. …

In the case of dynamically projected patterns, as we have shown above using the example of a F200 Intel RealSense device, body movements, including those associated with heartbeats and respiration, lead to changes in the non-uniform illumination distribution of the scene created by the light emitter of the F200 device, as captured by the infrared camera of the F200 device (the captured non-uniform illumination distribution forms the ‘artificial light texture’); in the absence of any motion in the scene, these changes would be absent.
Figure 3. ALT data obtained using the light emitter of a R200 Intel RealSense camera and the video stream of a Raspberry Pi NoIR camera. A snapshot of the scene taken from the Pi NoIR camera’s video stream is shown on the left. A person is resting in an armchair at about 3 feet distance from the Pi NoIR and R200 cameras. The R200 camera’s emitter provided most of the illumination for the scene. We used the code [12] from our previous “Pi Camera + Kinect” example [10] to generate the ALT data shown in this Figure.
As the ALT implementations described above and before [10] demonstrate, ALT technology does not rely on any particular kind of light pattern (statically and/or dynamically projected). ALT does not use the depth information encoded in the light patterns projected by depth-sensing devices such as Kinect and RealSense cameras.