
Commit 6b90aef

Update EyeTracking_Main.md

Moved references to the individual example scenes to their own landing page. Added a reference to a header picture.

1 parent 0c5aa59 commit 6b90aef

File tree

1 file changed: 8 additions & 65 deletions
```diff
@@ -1,67 +1,10 @@
-<!-- ![MRTK](/External/ReadMeImages/EyeTracking/MRTK_et_placeholder.png) -->
-# Eye Tracking in the MixedRealityToolkit
+![Eye Tracking in MRTK](/External/ReadMeImages/EyeTracking/mrtk_et_compilation.png)
+# Eye Tracking in the Mixed Reality Toolkit
 
-The Mixed Reality Toolkit supports HoloLens 2 which offers Eye Tracking input.
-MRTK offers several examples for how to utilize Eye Tracking in your applications.
-Eye Tracking enables users to quickly and effortlessly engage with holograms across their view.
-Below you can find an overview of several powerful examples on how to use Eye Tracking in your app.
+The [Mixed Reality Toolkit](https://github.com/Microsoft/MixedRealityToolkit-Unity) (MRTK) supports, among others, _'HoloLens 2'_, which offers an exciting and powerful new input: Eye Tracking!
+Eye Tracking enables users to quickly and effortlessly engage with holograms across their view and can make your system smarter by better identifying a user's intention.
+New to Eye Tracking? No problem! We have created a number of videos, tutorials and samples to get you started!
+1. [Getting started with Eye Tracking in MRTK](/Documentation/EyeTracking/EyeTracking_BasicSetup.md)
 
-You can directly build on the samples provided with MRTK, which are stored in the following folder:
-[\Assets\MixedRealityToolkit.Examples\Demos\EyeTracking](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking)
-
-If you want to start from a new Unity scene, check out the instructions on [Basic MRTK Setup to use Eye Tracking](/Documentation/EyeTracking/EyeTracking_BasicSetup.md).
-
-<br>
-
-
-## Overview of our Eye Tracking Input Tutorials
-
-[**Eye-Supported Target Selection**](/Documentation/EyeTracking/EyeTracking_TargetSelection.md)
-
-This tutorial showcases the ease of using the GazeProvider to access smoothed eye gaze data and eye-gaze-specific events to select targets. Several examples are shown
-of subtle yet powerful feedback, such as blending visual highlights in and out, holograms slowly turning towards the user when being looked at, and notifications
-disappearing after they are read.
-
-**Summary**: Fast and effortless target selections using a combination of Eyes+Voice and Eyes+Hands.
-
-<br>
-
-
-[**Eye-Supported Navigation**](/Documentation/EyeTracking/EyeTracking_Navigation.md)
-
-Imagine you're reading information on a slate, and when you reach the end of the displayed text, the text automatically scrolls up to reveal more content.
-Or you can fluently zoom in where you're looking, and the map automatically adjusts its content when you get close to the border to keep the content you are looking at in view.
-These are some of the examples showcased in this tutorial about eye-supported navigation.
-Another interesting application for hands-free observation of 3D holograms is automatically turning the looked-at parts of your hologram to the front.
-
-**Summary**: Scroll, Pan, Zoom and 3D Rotation using Eyes+Voice and Eyes+Hands.
-
-<br>
-
-
-[**Eye-Supported Positioning**](/Documentation/EyeTracking/EyeTracking_Positioning.md)
-
-In this tutorial, we showcase a popular input scenario called "Put that there", based on research work by Bolt from the early 1980s.
-The idea is simple: Benefit from your eyes for fast target selection and positioning.
-If refinement is required, use additional input from your hands, voice or controllers.
-
-**Summary**: Positioning holograms using Eyes+Voice & Eyes+Hands (*drag-and-drop*). Eye-supported sliders using Eyes+Hands.
-
-<br>
-
-
-[**Visualization of Visual Attention**](/Documentation/EyeTracking/EyeTracking_Visualization.md)
-
-Information about where users looked is an immensely powerful tool for assessing workflows and improving search patterns.
-In this tutorial, we discuss different eye tracking visualizations and how they fit different needs.
-We provide examples for logging and loading eye tracking data, and examples for how to visualize it.
-
-**Summary**: Two-dimensional attention maps (heatmaps) on slates. Recording & replaying eye tracking data.
-
-
-## To show or not to show an Eye Cursor?
-For your HoloLens 2 apps, we recommend *not* showing an eye cursor, as this has been shown to easily distract users and break the magic of having a system instinctively react to your intentions.
-However, in some situations, having the option to turn on an eye cursor is very helpful for identifying why the system is not reacting as expected.
-
-<br>
----
+2. [Building on the MRTK Eye Tracking samples](/Documentation/EyeTracking/EyeTracking_ExamplesOverview.md)
+---
```

0 commit comments