Commit 240063c

Merge pull request #3855 from sostel/sostel-patch-1

Eye Tracking documentation kick off

31 files changed (+579, -4 lines)
# Getting started with Eye Tracking in MRTK

This page covers how to set up your Unity MRTK scene to use Eye Tracking in your app.
The following assumes you are starting with a fresh scene.
Alternatively, you can check out our already configured Eye Tracking examples:
[MRTK Eye Tracking Examples](EyeTracking_ExamplesOverview.md).

### Setting up the scene

Set up the _MixedRealityToolkit_ by simply clicking _'Mixed Reality Toolkit -> Configure…'_ in the menu bar.

![MRTK](../../External/ReadMeImages/EyeTracking/mrtk_setup_configure.png)

### Setting up the MRTK profiles required for Eye Tracking

After setting up your MRTK scene, you will be asked to choose a profile for MRTK.
You can simply select _DefaultMixedRealityToolkitConfigurationProfile_ and then select the _'Copy & Customize'_ option.

![MRTK](../../External/ReadMeImages/EyeTracking/mrtk_setup_configprofile.png)

### Create an "Eye Gaze Data Provider"

- Navigate to the _'Input System Profile'_ in your MRTK main profile.

- To edit the default one ( _'DefaultMixedRealityInputSystemProfile'_ ), click the _'Clone'_ button next to it.

- Double-click on your new input profile and select _'+ Add Data Provider'_.

- Create a new data provider:
    - Under **Type**, select _'Microsoft.MixedReality.Toolkit.WindowsMixedReality.Input'_ -> _'WindowsMixedRealityEyeGazeDataProvider'_.
    - For **Platform(s)**, select _'Windows Universal'_.

![MRTK](../../External/ReadMeImages/EyeTracking/mrtk_setup_eyes_dataprovider.png)

### Enabling Eye Tracking in the GazeProvider

On HoloLens v1, head gaze was used as the primary pointing technique.
While head gaze is still available via the _GazeProvider_ in MRTK, which is attached to your [Camera](https://docs.unity3d.com/ScriptReference/Camera.html), you can opt to use eye gaze instead by ticking the _'Prefer Eye Tracking'_ checkbox as shown in the screenshot below.

![MRTK](../../External/ReadMeImages/EyeTracking/mrtk_setup_eyes_gazeprovider.png)

### Simulating Eye Tracking in the Unity Editor

You can simulate Eye Tracking input in the Unity Editor to ensure that events are correctly triggered before deploying the app to your HoloLens 2.
The eye gaze signal is simulated by simply using the camera's location as the eye gaze origin and the camera's forward vector as the eye gaze direction.
While this is great for initial testing, please note that it is not a good approximation of rapid eye movements.
For this reason, make sure to test your eye-based interactions frequently on the HoloLens 2.
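
Concretely, the Editor simulation boils down to treating the camera pose as the eye pose. The following is a minimal illustrative sketch of that idea, not the actual MRTK implementation (the class name is ours):

```csharp
using UnityEngine;

// Illustrative only: in the Editor, the camera stands in for the user's eyes.
public class SimulatedEyeGazeSketch : MonoBehaviour
{
    void Update()
    {
        Vector3 gazeOrigin = Camera.main.transform.position;   // simulated eye gaze origin
        Vector3 gazeDirection = Camera.main.transform.forward; // simulated eye gaze direction

        // Visualize the resulting gaze ray in the Scene view.
        Debug.DrawRay(gazeOrigin, gazeDirection * 2f, Color.cyan);
    }
}
```

Because this simulated ray always matches the head direction, interactions that depend on rapid eye movements will not be exercised in the Editor.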

1. **Enable simulated Eye Tracking**:
    - Navigate to your main _'MRTK Configuration Profile'_ -> _'Input System Profile'_ -> _'Data Providers'_ -> _'Input Simulation Service'_.
    - Check the _'Simulate Eye Position'_ checkbox.

![MRTK](../../External/ReadMeImages/EyeTracking/mrtk_setup_eyes_simulate.png)

2. **Disable the default head gaze cursor**:
    In general, we recommend avoiding an eye gaze cursor or, if you insist on showing one, making it _very_ subtle.
    Check out our [eye gaze cursor tutorial](EyeTracking_Cursor.md) for more information on how to best handle it.
    We do recommend hiding the default head gaze cursor that is attached to the MRTK gaze pointer profile by default.
    - Navigate to your main _'MRTK Configuration Profile'_ -> _'Input System Profile'_ -> _'PointerSettings.PointerProfile'_.
    - At the bottom of the _'PointerProfile'_, assign an invisible cursor prefab to the _'GazeCursor'_. If you downloaded the MRTK Examples folder, you can simply reference the included _'EyeGazeCursor'_ prefab.

![MRTK](../../External/ReadMeImages/EyeTracking/mrtk_setup_eyes_gazesettings.png)

### Accessing eye gaze data

Now that your scene is set up to use Eye Tracking, let's take a look at how to access it in your scripts:
[Accessing Eye Tracking Data in your Unity Script](EyeTracking_EyeGazeProvider.md).

### Testing your Unity app on a HoloLens 2

Building your app with Eye Tracking should be similar to how you would compile other HoloLens 2 MRTK apps.
The only difference is that the *'Gaze Input'* capability is unfortunately not yet supported by Unity under 'Player Settings -> Publishing Settings -> Capabilities'.
To use Eye Tracking on your HoloLens 2 device, you need to manually edit the package manifest that is part of your built Visual Studio project.
Follow these steps:

1. Build your Unity project as you would normally do for _HoloLens 2_.
2. Open your compiled Visual Studio project and then open the _'Package.appxmanifest'_ in your solution.
3. Make sure to tick the _'Gaze Input'_ checkbox under _Capabilities_.

![Enabling Gaze Input in Visual Studio](../../External/ReadMeImages/EyeTracking/mrtk_et_gazeinput.jpg)
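
Alternatively, if you prefer editing the manifest XML directly (in Visual Studio, right-click _'Package.appxmanifest'_ -> _'View Code'_), ticking the checkbox corresponds to a device capability entry along these lines. This is a fragment only; verify the exact capability name against your generated manifest:

```xml
<Capabilities>
  <!-- Required to use Eye Tracking on HoloLens 2 -->
  <DeviceCapability Name="gazeInput" />
</Capabilities>
```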

---
[Back to "Eye Tracking in the MixedRealityToolkit"](EyeTracking_Main.md)

# To show or not to show an eye gaze cursor

This page covers design guidelines on when and how to use an eye gaze cursor.

<!-- TODO: Add more info -->

---
[Back to "Eye Tracking in the MixedRealityToolkit"](EyeTracking_Main.md)

# Eye Tracking Examples in MRTK

This page covers how to quickly get started with using Eye Tracking in MRTK by building on our provided MRTK example package.
The samples let you experience one of our new magical input capabilities: **Eye Tracking**!
The demos include a number of different use cases for seamlessly combining information about what you are looking at with **Voice** and **Hand** input.
This enables users to quickly and effortlessly select and move holograms across their view simply by looking at a target and saying _'Select'_ or performing a hand gesture.
The demos also include an example for eye-gaze-directed scroll, pan and zoom of text and images on a slate.
Finally, an example is provided for recording and visualizing the user's visual attention on a 2D slate.

## Overview of MRTK Eye Tracking Samples

[**Eye-Supported Target Selection**](EyeTracking_TargetSelection.md)

This tutorial showcases the ease of accessing eye gaze data to select targets.
It includes an example of subtle yet powerful feedback that gives the user confidence that a target is focused without being overwhelming.
In addition, we showcase a simple example of smart notifications that automatically disappear after being read.

**Summary**: Fast and effortless target selections using a combination of Eyes, Voice and Hand input.

<br>

[**Eye-Supported Navigation**](EyeTracking_Navigation.md)

Imagine you are reading some information on a distant display or your e-reader, and when you reach the end of the displayed text, the text automatically scrolls up to reveal more content.
Or how about magically zooming directly toward where you are looking?
These are some of the examples showcased in this tutorial about eye-supported navigation.
In addition, we added an example for hands-free rotation of 3D holograms by making them automatically rotate based on your current focus.

**Summary**: Scroll, Pan, Zoom, 3D Rotation using a combination of Eyes, Voice and Hand input.

<br>

[**Eye-Supported Positioning**](EyeTracking_Positioning.md)

In this tutorial, we extend an input scenario called [Put that there](https://youtu.be/CbIn8p4_4CQ), dating back to research from the MIT Media Lab in the early 1980s, with eye, hand and voice input.
The idea is simple: Benefit from your eyes for fast target selection and positioning.
Simply look at a hologram and say _'put this'_, then look over to where you want to place it and say _'there!'_.
To position your hologram more precisely, you can use additional input from your hands, voice or controllers.

**Summary**: Positioning holograms using Eyes, Voice and Hand input (*drag-and-drop*). Eye-supported sliders using Eyes+Hands.

<br>

[**Visualization of Visual Attention**](EyeTracking_Visualization.md)

Information about where users look is an immensely powerful tool to assess the usability of a design and to identify problems in efficient work streams.
In this tutorial, we discuss different eye tracking visualizations and how they fit different needs.
We provide basic examples for logging and loading Eye Tracking data and examples for how to visualize it.

**Summary**: Two-dimensional attention maps (heatmaps) on slates. Recording & replaying Eye Tracking data.

---
[Back to "Eye Tracking in the MixedRealityToolkit"](EyeTracking_Main.md)

# Accessing Eye Tracking Data in your Unity Script

The following assumes that you followed the steps for setting up Eye Tracking in your MRTK scene (see [Basic MRTK Setup to use Eye Tracking](EyeTracking_BasicSetup.md)).

Accessing Eye Tracking data in your MonoBehaviour scripts is easy! Simply use *MixedRealityToolkit.InputSystem.EyeGazeProvider*.

## MixedRealityToolkit.InputSystem.EyeGazeProvider

While *MixedRealityToolkit.InputSystem.EyeGazeProvider* provides several helpful variables, the key ones for Eye Tracking input are the following:

- **UseEyeTracking**:
  True if Eye Tracking hardware is available and the user has given permission to use Eye Tracking in the app.

- **IsEyeGazeValid**:
  Indicates whether the current Eye Tracking data is valid.
  It may be invalid due to an exceeded timeout (it should be robust to the user blinking, though) or a lack of tracking hardware or permissions.

- **GazeOrigin**:
  Origin of the gaze ray.
  Please note that this will return the *head* gaze origin if 'IsEyeGazeValid' is false.

- **GazeDirection**:
  Direction of the gaze ray.
  This will return the *head* gaze direction if 'IsEyeGazeValid' is false.

- **HitInfo**, **HitPosition**, **HitNormal**, etc.:
  Information about the currently gazed-at target.
  Again, if 'IsEyeGazeValid' is false, this will be based on the user's *head* gaze.

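Putting these properties together, a minimal illustrative script that only reacts while the eye gaze signal is actually valid might look like the sketch below. Note that namespaces and member availability can differ between MRTK versions, so treat this as a starting point rather than a verbatim recipe:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Illustrative sketch: log the gaze ray only while eye tracking data is valid.
public class EyeGazeStatusLogger : MonoBehaviour
{
    void Update()
    {
        var gazeProvider = MixedRealityToolkit.InputSystem.EyeGazeProvider;

        // If eye gaze is unavailable or invalid, GazeOrigin/GazeDirection
        // silently fall back to *head* gaze, so check the flags first.
        if (gazeProvider.UseEyeTracking && gazeProvider.IsEyeGazeValid)
        {
            Debug.Log($"Eye gaze: {gazeProvider.GazeOrigin} -> {gazeProvider.GazeDirection}");
        }
    }
}
```
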
## Examples for using MixedRealityToolkit.InputSystem.EyeGazeProvider

Here is an example from
[FollowEyeGaze.cs](xref:Microsoft.MixedReality.Toolkit.Examples.Demos.EyeTracking.FollowEyeGaze):

- Get the point of a hologram that the user is looking at:

```csharp
// Show the object at the hit position of the user's eye gaze ray with the target.
gameObject.transform.position = MixedRealityToolkit.InputSystem.EyeGazeProvider.HitPosition;
```

- Show a visual asset at a fixed distance from where the user is currently looking:

```csharp
// If no target is hit, show the object at a default distance along the gaze ray.
gameObject.transform.position =
    MixedRealityToolkit.InputSystem.EyeGazeProvider.GazeOrigin +
    MixedRealityToolkit.InputSystem.EyeGazeProvider.GazeDirection.normalized * defaultDistanceInMeters;
```
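
The two snippets above can be combined into one small script that makes a GameObject follow the user's gaze. The following is a sketch in the spirit of FollowEyeGaze.cs, not a copy of it; `defaultDistanceInMeters` is an illustrative name, and the hit check may be spelled differently in your MRTK version, so compare against the actual sample:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Illustrative sketch: make this GameObject follow the user's (eye) gaze.
public class FollowGazeSketch : MonoBehaviour
{
    // Fallback distance along the gaze ray when nothing is hit (name is ours).
    [SerializeField]
    private float defaultDistanceInMeters = 2f;

    void Update()
    {
        var gazeProvider = MixedRealityToolkit.InputSystem.EyeGazeProvider;

        // 'raycastValid' flags a successful hit; field names may vary by MRTK version.
        if (gazeProvider.HitInfo.raycastValid)
        {
            // Snap to the point on the target that the user is looking at.
            gameObject.transform.position = gazeProvider.HitPosition;
        }
        else
        {
            // Otherwise, hover at a default distance along the gaze ray.
            gameObject.transform.position =
                gazeProvider.GazeOrigin +
                gazeProvider.GazeDirection.normalized * defaultDistanceInMeters;
        }
    }
}
```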
---
[Back to "Eye Tracking in the MixedRealityToolkit"](EyeTracking_Main.md)

![Eye Tracking in MRTK](../../External/ReadMeImages/EyeTracking/mrtk_et_compilation.png)

# Eye Tracking in the Mixed Reality Toolkit

The [Mixed Reality Toolkit](https://github.com/Microsoft/MixedRealityToolkit-Unity) (MRTK) supports, amongst others, _'HoloLens 2'_, which offers an exciting and powerful new input: Eye Tracking!
Eye Tracking enables users to quickly and effortlessly engage with holograms across their view and can make your system smarter by better identifying a user's intention.
New to Eye Tracking? No problem! We have created a number of videos, tutorials and samples to get you started!

1. [Getting started with Eye Tracking in MRTK](EyeTracking_BasicSetup.md)
2. [Building on the MRTK Eye Tracking samples](EyeTracking_ExamplesOverview.md)

---
