Commit b879621

Update EyeTracking_ExamplesOverview.md
Updating description of scenario 3 and 4.
1 parent e671fb4 commit b879621

File tree

1 file changed: +8 −7 lines

Documentation/EyeTracking/EyeTracking_ExamplesOverview.md

Lines changed: 8 additions & 7 deletions
@@ -7,7 +7,7 @@ The demos also include an example for eye-gaze-directed scroll, pan and zoom of
 Finally, an example is provided for recording and visualizing the user's visual attention on a 2D slate.
 
 
-## Overview of our Eye Tracking Input Tutorials
+## Overview of MRTK Eye Tracking Samples
 
 
 [**Eye-Supported Target Selection**](/Documentation/EyeTracking/EyeTracking_TargetSelection.md)
@@ -34,22 +34,23 @@ In addition, we added an example for hands-free rotation of 3D holograms by maki
 
 [**Eye-Supported Positioning**](/Documentation/EyeTracking/EyeTracking_Positioning.md)
 
-In this tutorial, we showcase a popular input scenario called "Put that there" based on research work from Bolt in the early 1980s.
+In this tutorial, we extend an input scenario called ["Put that there"](https://youtu.be/CbIn8p4_4CQ), dating back to research from the MIT Media Lab in the early 1980s, with eye, hand and voice input.
 The idea is simple: Benefit from your eyes for fast target selection and positioning.
-If refinement is required, use additional input from your hands, voice or controllers.
+Simply look at a hologram and say _'put this'_, then look over to where you want to place it and say _'there!'_.
+For more precise positioning of your hologram, you can use additional input from your hands, voice or controllers.
 
-**Summary**: Positioning holograms using Eyes+Voice & Eyes+Hands (*drag-and-drop*). Eye-supported sliders using Eyes+Hands.
+**Summary**: Positioning holograms using Eyes, Voice and Hand input (*drag-and-drop*). Eye-supported sliders using Eyes+Hands.
 
 
 <br>
 
 
 [**Visualization of Visual Attention**](/Documentation/EyeTracking/EyeTracking_Visualization.md)
 
-Information about where users looked at is an immensely powerful tool to assess work streams and improve search patterns.
+Information about where users look is an immensely powerful tool to assess the usability of a design and to identify problems in inefficient work streams.
 In this tutorial, we discuss different eye tracking visualizations and how they fit different needs.
-We provide you with examples for logging and loading eye tracking data and examples for how to visualize them.
+We provide basic examples for logging and loading eye tracking data and examples for how to visualize them.
 
-**Summary**: Two-dimensional attention map (heatmaps) on slates. Recording & Replaying Eye Tracking data.
+**Summary**: Two-dimensional attention map (heatmaps) on slates. Recording & replaying Eye Tracking data.
 
 ---
 [Back to "Eye Tracking in the MixedRealityToolkit"](/Documentation/EyeTracking/EyeTracking_Main.md)
