In this tutorial, we extend an input scenario called [Put that there](https://youtu.be/CbIn8p4_4CQ), dating back to research from the MIT Media Lab in the early 1980s, with eye, hand and voice input.
The idea is simple: Benefit from your eyes for fast target selection and positioning.
Simply look at a hologram and say _'put this'_, then look over to where you want to place it and say _'there!'_.
To position your hologram more precisely, you can use additional input from your hands, voice or controllers.
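
Below is a minimal, hypothetical Unity C# sketch of this interaction. It uses Unity's `KeywordRecognizer` for the two voice commands and a ray from the main camera as a stand-in for the eye gaze ray; the class name and wiring are purely illustrative, and the actual tutorial scene is built on MRTK's input system, which may differ.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;  // Windows-only speech keyword recognition

// Hypothetical example component (not part of the tutorial scene):
// say "put this" while looking at a hologram to pick it, then look at the
// desired spot and say "there" to place it. A ray from the main camera stands
// in for the eye gaze ray here; in an MRTK scene the eye gaze provider would
// supply that ray instead.
public class PutThatThere : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private Transform grabbedTarget;

    private void Start()
    {
        recognizer = new KeywordRecognizer(new[] { "put this", "there" });
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        // Approximate the gaze ray with the camera's forward direction.
        var gazeRay = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (!Physics.Raycast(gazeRay, out RaycastHit hit))
        {
            return;  // The user is not looking at any collider.
        }

        if (args.text == "put this")
        {
            grabbedTarget = hit.transform;      // Remember the hologram being looked at.
        }
        else if (args.text == "there" && grabbedTarget != null)
        {
            grabbedTarget.position = hit.point; // Move it to the newly gazed-at position.
            grabbedTarget = null;
        }
    }

    private void OnDestroy()
    {
        recognizer?.Dispose();
    }
}
```
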
**Summary**: Positioning holograms using Eyes, Voice and Hand input (*drag-and-drop*). Eye-supported sliders using Eyes+Hands.
<br>
[**Visualization of Visual Attention**](/Documentation/EyeTracking/EyeTracking_Visualization.md)
Information about where users look is an immensely powerful tool to assess the usability of a design and to identify inefficiencies in work streams.
In this tutorial, we discuss different eye tracking visualizations and how they fit different needs.
We provide basic examples for logging and loading eye tracking data, as well as examples for how to visualize it.
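
As a rough illustration of what such logging could look like, here is a hypothetical Unity C# sketch that writes one time-stamped gaze sample per frame to a CSV file. The class name, file layout and the use of the main camera's pose as a stand-in for the eye gaze ray are assumptions for this sketch, not the format used in the tutorial.

```csharp
using System.Globalization;
using System.IO;
using UnityEngine;

// Hypothetical logger (names and CSV layout are assumptions): writes one
// time-stamped gaze sample per frame so the data can be loaded and visualized
// later, e.g. as a gaze trail or heat map. The main camera's pose stands in
// for the eye gaze ray.
public class BasicGazeLogger : MonoBehaviour
{
    private StreamWriter writer;

    private void Start()
    {
        var path = Path.Combine(Application.persistentDataPath, "gaze_log.csv");
        writer = new StreamWriter(path);
        writer.WriteLine("timestamp;origin_x;origin_y;origin_z;dir_x;dir_y;dir_z");
    }

    private void Update()
    {
        Vector3 origin = Camera.main.transform.position;
        Vector3 dir = Camera.main.transform.forward;

        // Invariant culture keeps the decimal separator consistent across devices.
        writer.WriteLine(string.Format(CultureInfo.InvariantCulture,
            "{0};{1};{2};{3};{4};{5};{6}",
            Time.time, origin.x, origin.y, origin.z, dir.x, dir.y, dir.z));
    }

    private void OnDestroy()
    {
        writer?.Dispose();  // Flushes and closes the log file.
    }
}
```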