# Eye Tracking Examples in MRTK
This page covers how to quickly get started using Eye Tracking in MRTK by building on our provided MRTK example package.
The samples let you experience one of our new magical input capabilities: **Eye Tracking**!
The demos include a number of different use cases for seamlessly combining information about what you are looking at with **Voice** and **Hand** input.
This enables users to quickly and effortlessly select and move holograms across their view simply by looking at a target and saying _'Select'_ or performing a hand gesture.
The demos also include an example for eye-gaze-directed scroll, pan and zoom of text and images on a slate.
Finally, an example is provided for recording and visualizing the user's visual attention on a 2D slate.


## Overview of our Eye Tracking Input Tutorials

[**Eye-Supported Target Selection**](/Documentation/EyeTracking/EyeTracking_TargetSelection.md)

This tutorial showcases how easy it is to access eye gaze data for selecting targets.
It includes an example of subtle yet powerful feedback that gives users confidence a target is focused without being overwhelming.
In addition, we showcase a simple example of smart notifications that automatically disappear after being read.

**Summary**: Fast and effortless target selection using a combination of Eyes, Voice and Hand input.
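
As a minimal sketch of the underlying idea (not the demo's actual script), the snippet below raycasts along the user's eye gaze each frame to find the focused target. It assumes MRTK's eye gaze provider is configured for the scene; the property names (`IsEyeTrackingEnabledAndValid`, `GazeOrigin`, `GazeDirection`) reflect MRTK 2.x and the highlight hook and `maxGazeDistance` value are illustrative only:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Illustrative only: query what the user is currently looking at.
public class EyeGazeTargetQuery : MonoBehaviour
{
    [SerializeField] private float maxGazeDistance = 10f; // assumption: targets within 10 m

    private void Update()
    {
        IMixedRealityEyeGazeProvider eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze == null || !eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            return;
        }

        // Cast a ray from the gaze origin along the gaze direction to find the focused target.
        if (Physics.Raycast(eyeGaze.GazeOrigin, eyeGaze.GazeDirection, out RaycastHit hitInfo, maxGazeDistance))
        {
            // Hook up subtle highlight feedback on hitInfo.collider here, e.g. a gentle glow.
            Debug.Log($"Currently looking at: {hitInfo.collider.name}");
        }
    }
}
```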

<br>

[**Eye-Supported Navigation**](/Documentation/EyeTracking/EyeTracking_Navigation.md)

Imagine you are reading some information on a distant display or your e-reader, and when you reach the end of the displayed text, it automatically scrolls up to reveal more content.
Or how about magically zooming in directly toward where you are looking?
These are some of the examples showcased in this tutorial about eye-supported navigation.
In addition, we added an example for hands-free rotation of 3D holograms by making them automatically rotate based on your current focus.

**Summary**: Scroll, Pan, Zoom, 3D Rotation using a combination of Eyes, Voice and Hand input.
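
To illustrate the idea behind gaze-directed scrolling, here is a hypothetical sketch (not the demo's own implementation): when the gaze hit point dwells near the lower edge of a slate, the content slowly scrolls up. The `contentRoot` field, panel dimensions and threshold are assumptions, and the eye gaze properties are those of MRTK 2.x:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Hypothetical sketch: scroll a slate's content when the user reads near its lower edge.
public class GazeDirectedScroll : MonoBehaviour
{
    [SerializeField] private Transform contentRoot;        // scrollable text/image content (assumption)
    [SerializeField] private float panelHeight = 0.5f;     // slate height in meters (assumption)
    [SerializeField] private float edgeThreshold = 0.15f;  // lower 15% of the slate triggers scrolling
    [SerializeField] private float scrollSpeed = 0.05f;    // meters per second

    private void Update()
    {
        var eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze == null || !eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            return;
        }

        // Convert the 3D gaze hit point into the slate's local space.
        Vector3 localHit = transform.InverseTransformPoint(eyeGaze.HitPosition);

        // If the user is looking near the bottom edge, reveal more content by moving it up.
        if (localHit.y < -panelHeight * 0.5f + panelHeight * edgeThreshold)
        {
            contentRoot.localPosition += Vector3.up * scrollSpeed * Time.deltaTime;
        }
    }
}
```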

<br>

[**Eye-Supported Positioning**](/Documentation/EyeTracking/EyeTracking_Positioning.md)

In this tutorial, we showcase a popular input scenario called “Put that there”, based on research by Bolt in the early 1980s.
The idea is simple: Benefit from your eyes for fast target selection and positioning.
If refinement is required, use additional input from your hands, voice or controllers.

**Summary**: Positioning holograms using Eyes+Voice & Eyes+Hands (*drag-and-drop*). Eye-supported sliders using Eyes+Hands.
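
A hedged sketch of how “put that there” could be wired up: a voice command or hand gesture (connected elsewhere, for example via MRTK's `SpeechInputHandler`) calls a method that drops the selected hologram at the current gaze hit point. The `targetToMove` field stands in for whatever object was previously selected, and the gaze properties are those of MRTK 2.x:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Hypothetical sketch of "put that there": place the selected hologram where the user is looking.
public class PutThatThere : MonoBehaviour
{
    [SerializeField] private Transform targetToMove; // the previously selected hologram (assumption)

    // Call this from a voice command or hand gesture, e.g. the phrase "Put this there".
    public void PlaceAtGaze()
    {
        var eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze == null || !eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            return;
        }

        // Drop the hologram at the point the eye gaze ray currently hits.
        targetToMove.position = eyeGaze.HitPosition;
    }
}
```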

<br>

[**Visualization of Visual Attention**](/Documentation/EyeTracking/EyeTracking_Visualization.md)

Information about where users look is an immensely powerful tool for assessing workflows and improving search patterns.
In this tutorial, we discuss different eye tracking visualizations and how they fit different needs.
We provide examples for logging and loading eye tracking data, as well as examples for how to visualize it.

**Summary**: Two-dimensional attention maps (heatmaps) on slates. Recording & replaying eye tracking data.
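
As a rough illustration of the recording side (the demo scenes ship their own, more complete logging scripts), the sketch below appends one gaze sample per frame to a CSV file that could later drive a heatmap or replay. The file name and columns are assumptions, and the gaze properties are those of MRTK 2.x:

```csharp
using System.Globalization;
using System.IO;
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Illustrative sketch: log eye gaze hit positions for later replay or heatmap rendering.
public class SimpleGazeLogger : MonoBehaviour
{
    private string logPath;

    private void Start()
    {
        logPath = Path.Combine(Application.persistentDataPath, "gaze_log.csv"); // assumed file name
        File.WriteAllText(logPath, "time,hitX,hitY,hitZ\n");
    }

    private void Update()
    {
        var eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze == null || !eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            return;
        }

        // One sample per frame: timestamp plus the 3D point the gaze ray hits (e.g. on a slate).
        Vector3 hit = eyeGaze.HitPosition;
        File.AppendAllText(logPath, string.Format(CultureInfo.InvariantCulture,
            "{0},{1},{2},{3}\n", Time.time, hit.x, hit.y, hit.z));
    }
}
```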


## To show or not to show an Eye Cursor?
For your HoloLens 2 apps, we recommend *not* showing an eye cursor, as it has been shown to easily distract users and break the magic of a system that instinctively reacts to your intentions.
However, in some situations, having the option to turn on an eye cursor is very helpful for identifying why the system is not reacting as expected.
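
For that kind of debugging, a small toggleable cursor can make the gaze signal visible. A minimal sketch, assuming a simple sphere (or similar visual) is assigned in the Inspector and using the MRTK 2.x gaze properties as above:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Debug aid only: show a small visual at the current eye gaze hit point to verify tracking.
public class DebugEyeCursor : MonoBehaviour
{
    [SerializeField] private Transform cursorVisual;  // e.g. a small sphere (assumption)
    [SerializeField] private bool showCursor = false; // keep this off in shipping builds

    private void Update()
    {
        var eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        bool hasValidGaze = eyeGaze != null && eyeGaze.IsEyeTrackingEnabledAndValid;

        cursorVisual.gameObject.SetActive(showCursor && hasValidGaze);
        if (showCursor && hasValidGaze)
        {
            // Snap the debug cursor to wherever the eye gaze ray currently hits.
            cursorVisual.position = eyeGaze.HitPosition;
        }
    }
}
```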

---
[Back to "Eye Tracking in the MixedRealityToolkit"](/Documentation/EyeTracking/EyeTracking_Main.md)