## ALT pulse and respiration monitoring using Intel RealSense cameras

### *An ALTernative (to depth sensing and 3D object scanning) pulse and respiration monitoring application for Intel RealSense cameras.*

Intel RealSense technology [1] enables depth perception, 3D imaging, and feature tracking for virtual reality, robotics, and other applications. The light emitters of RealSense cameras can also be used [2] as light sources for the artificial light texture (ALT) technology [2, 3, 4, 5, 6]. ALT obtains heart rate and respiration rate data for a person in a non-contact fashion; the skin does not need to be exposed for ALT to work [2-6], and ALT does not use depth information [2-6].

ALT can use the RealSense cameras themselves for video stream capture, or another camera such as a Raspberry Pi NoIR camera [7] [2-6]. The figures below show ALT data obtained using the infrared (IR) video streams of Intel RealSense R200 [8] and F200 [9] cameras, and the video stream of a Raspberry Pi NoIR camera. The R200 and F200 cameras used their own light projectors for ALT; the R200 camera's projector was used with the Pi NoIR camera.

ALT can work with different types of static light patterns generated by various devices such as Microsoft Kinect [2-4, 10] and Intel RealSense R200 [5; also see below]. ALT can also work with dynamically projected patterns, as the F200 example below demonstrates [also see reference 5].

As we have discussed before, one of the possible implementations of the ALT technology [10] includes obtaining 'Sum of Absolute Differences' (SAD) [11] values generated by a video encoder for the video frames captured by a video camera [12].

As an alternative to using video encoder data, the calculation of the SAD values can be incorporated into the video data processing part of the ALT software in other ways, as we discuss below and show in the code snippets for the RealSense cameras.

As one possible implementation, SAD-generating code can iterate over the pixels of a given captured video frame. For each pixel, it calculates the difference between a numeric value of a certain kind associated with that pixel in the video frame data (e.g. the value corresponding to the pixel's grayscale level) and the numeric value of the same kind associated with the corresponding pixel of another captured video frame (two pixels belonging to different video frames can be designated as corresponding to one another if, for example, their pixel row and pixel column numbers are the same). It then takes the absolute value of the difference and adds it to the running sum of the absolute values accumulated on the previous iterations. The sum of the absolute difference values (the SAD value) thus calculated for a given video frame is analogous to the sSAD value obtained from the data generated by a video encoder, as we have discussed earlier [10, 12].

The SAD value calculated for a captured video frame is a simple metric of the similarity between that video frame and another video frame (called the 'reference' video frame) whose data were used in the calculation. The SAD value is the "Manhattan distance" [13] between the two video frames, calculated using the numeric values associated with the video frame pixels.

Note that for any captured video frame we typically use the one immediately preceding it as the 'reference' video frame in the SAD value calculations.

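The per-pixel procedure described above can be sketched as follows. This is a minimal illustration only, not the authors' implementation; it assumes frames are available as 2-D arrays (lists of rows) of grayscale values:

```python
def sad(frame, reference):
    """Sum of absolute per-pixel differences between two equally sized frames.

    Pixels are taken as corresponding when their row and column numbers match.
    """
    total = 0
    for row, ref_row in zip(frame, reference):
        for value, ref_value in zip(row, ref_row):
            total += abs(value - ref_value)  # accumulate |difference|
    return total


def alt_series(frames):
    """One SAD value per captured frame, using the preceding frame as reference."""
    return [sad(frames[i], frames[i - 1]) for i in range(1, len(frames))]
```

For example, `sad([[10, 20], [30, 40]], [[12, 18], [30, 45]])` returns `9` (2 + 2 + 0 + 5).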
Respiration, heartbeats, and/or other movements of a person's body cause additional variations of the "Manhattan distance" between consecutive video frames compared to the case when there are no body movements (and/or body-caused movements) in the scene. Thus, the calculated SAD values contain information about the respiration, heartbeats, and/or other movements of the person over the time period covered by the captured video frames.

As we have discussed earlier [10], applying the 'artificial light texture' to the person's body and/or to the objects surrounding the person significantly enhances (potentially by orders of magnitude) the variations in the "Manhattan distance" between consecutive video frames that are associated with the person's respiration and/or heartbeats, compared to the case when the 'artificial light texture' is absent (e.g. when the ALT-generating light emitter is switched off) and otherwise identical data collection and data processing steps are performed.

Provided that the video frames are captured at equal time intervals, the calculated SAD values can be viewed as the integral (summed) rate of change of the numeric values that are associated with the video frame pixels and used in the SAD calculations.

The code snippets below illustrate how SAD values can be calculated for the infrared (IR) channel frames captured by the R200 and F200 devices.

R200:
```C#
//To do
//See the implementation description above
```

F200:
```C#
//To do
//See the implementation description above
```

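While the C# snippets are left as to-dos, the same computation over a sequence of already-captured IR frames can be sketched with NumPy. This is an illustration only: the RealSense capture calls are omitted, and the frame format (equally sized 2-D 8-bit grayscale arrays) is an assumption:

```python
import numpy as np


def alt_sad_series(frames):
    """SAD value for each frame against the immediately preceding frame.

    frames: sequence of equally sized 2-D uint8 grayscale arrays
    (e.g. IR frames already pulled from the camera; capture code omitted).
    """
    stack = np.stack([f.astype(np.int32) for f in frames])  # avoid uint8 wraparound
    diffs = np.abs(stack[1:] - stack[:-1])                  # per-pixel |differences|
    return diffs.sum(axis=(1, 2)).tolist()                  # one SAD per frame pair
```

Casting to a wider integer type before subtracting matters here: subtracting uint8 arrays directly would wrap around modulo 256 and corrupt the SAD values.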
The raw ALT data are shown in the figures below by a black line connecting the data points (the calculated SAD values for the captured video frames, shown as black dots). The orange and blue lines are 0.2 s and 1 s long (in the equivalent number of data samples) moving averages of the raw data, shown to highlight the heartbeat and respiration processes captured in the ALT data, respectively. Numeric values for the heart rate and respiration rate can be obtained, for example, via Fourier data analysis [14].

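This post-processing step can be sketched as follows (an illustration only; the window lengths and sampling rate mentioned below are assumptions, not values from the source):

```python
import numpy as np


def moving_average(values, window):
    """Simple moving average over the given window length (in samples)."""
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="valid")


def dominant_rate_hz(values, sample_rate_hz):
    """Frequency (in Hz) of the strongest non-DC spectral component."""
    values = np.asarray(values, dtype=float)
    spectrum = np.abs(np.fft.rfft(values - values.mean()))  # remove the DC offset
    freqs = np.fft.rfftfreq(len(values), d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum)]
```

At a 30 fps capture rate, a 0.2 s moving average corresponds to a 6-sample window and a 1 s average to a 30-sample window; a dominant component near 1.2 Hz would correspond to roughly 72 beats per minute.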
Snapshots of the scene captured by the cameras are shown to the left of the corresponding data plots.

![R200 ALT data example](figures/RealSense/R200-ALT-data-example.png)

Figure 1. ALT data obtained using the light emitter and IR video stream of an R200 Intel RealSense camera. A snapshot of the scene taken from the R200 IR video stream is shown on the left. A person is sitting in an armchair about 3 feet from the R200 camera.

![F200 ALT data example](figures/RealSense/F200-ALT-data-example.png)

Figure 2. ALT data obtained using the light emitter and IR video stream of an F200 Intel RealSense camera. A snapshot of the scene taken from the F200 IR video stream is shown on the left. A person is sitting on a chair about 3 feet from the F200 camera.

In the case of dynamically projected patterns, as shown above with the F200 Intel RealSense device, body movements, including the ones associated with heartbeats and respiration, change the non-uniform illumination distribution of the scene created by the light emitter of the F200 device, as captured by the device's infrared camera (the captured non-uniform illumination distribution forms the 'artificial light texture'). In the absence of any motion in the scene, these changes would be absent as well.

Note that, similarly to the "Pi Camera + Kinect" ALT system [10], the distance between the F200 or R200 camera and the person can affect how pronounced the heartbeat signal is during respiration events. Generally, once the camera gets closer to the person than a certain distance, the heartbeat signal component in the ALT data becomes less pronounced during respiration events. Note also that at a large enough distance between the camera and the person there will be virtually no discernible pulse or respiration signal in the ALT data. The camera's position can be adjusted, for example, based on visualizations of the collected ALT data.

![PiCamera video capture with R200 emitter ALT data example](figures/RealSense/PiCamera-video-capture-with-R200-emitter-ALT-data-example.png)

Figure 3. ALT data obtained using the light emitter of an R200 Intel RealSense camera and the video stream of a Raspberry Pi NoIR camera. A snapshot of the scene taken from the Pi NoIR camera's video stream is shown on the left. A person is resting in an armchair about 3 feet from the Pi NoIR and R200 cameras. The R200 camera's emitter provided most of the illumination for the scene. We used the code [12] from our previous "Pi Camera + Kinect" example [10] to generate the ALT data shown in this figure.

**References**:

1. http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html
2. "Non-contact real-time monitoring of heart and respiration rates using Artificial Light Texture", https://www.linkedin.com/pulse/use-artificial-light-texture-non-contact-real-time-heart-misharin
3. "What has happened while we slept?", https://www.linkedin.com/pulse/what-has-happened-while-we-slept-alexander-misharin
4. "When your heart beats", https://www.linkedin.com/pulse/when-your-heart-beats-alexander-misharin
5. "ALT pulse and respiration monitoring using Intel RealSense cameras", https://www.linkedin.com/pulse/alt-pulse-respiration-monitoring-using-intel-cameras-misharin
6. "Artificial Light Texture (ALT) for respiration and heart rate monitoring", https://github.com/lvetech/ALT
7. https://www.raspberrypi.org/products/pi-noir-camera-v2/
8. "Introducing the Intel® RealSense™ R200 Camera (world facing)", https://software.intel.com/en-us/articles/realsense-r200-camera
9. "Can Your Webcam Do This? - Exploring the Intel® RealSense™ 3D Camera (F200)", https://software.intel.com/en-us/blogs/2015/01/26/can-your-webcam-do-this
10. https://github.com/lvetech/ALT/blob/master/README.md
11. "Sum of absolute differences", https://en.wikipedia.org/wiki/Sum_of_absolute_differences
12. https://github.com/lvetech/ALT/blob/master/code/simple-ALT-raw.py
13. "Taxicab geometry", https://en.wikipedia.org/wiki/Taxicab_geometry
14. "Short-time Fourier transform", https://en.wikipedia.org/wiki/Short-time_Fourier_transform

<a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" /></a><br /><span xmlns:dct="http://purl.org/dc/terms/" property="dct:title">ALT</span> by <a xmlns:cc="http://creativecommons.org/ns#" href="https://www.linkedin.com/in/alexmisharin" property="cc:attributionName" rel="cc:attributionURL">Alexander Misharin, LVE Technologies LLC</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.