This page discusses different options for accessing eye gaze data and eye-gaze-specific events to select targets in MRTK.
Eye Tracking allows for fast and effortless target selection by combining information about what a user is looking at with additional inputs such as *hand tracking* and *voice commands*:
- Look & Pinch (i.e., hold up your hand in front of you and pinch your thumb and index finger together)
- Look & Say *"Select"* (default voice command)
- Look & Say *"Explode"* or *"Pop"* (custom voice commands)
- Look & Bluetooth button
## Target Selection
The way users focus holograms using their eyes is the same as for any other focus input (e.g., head gaze).
This provides the great advantage of a flexible way to interact with your holograms: you define the main focus type in your MRTK Input Pointer Profile depending on your users' needs, while leaving your code untouched.
For example, this lets you switch between head gaze and eye gaze without changing a line of code.

To detect when a hologram receives focus, use the *IMixedRealityFocusHandler* interface, which provides two interface members: *OnFocusEnter* and *OnFocusExit*.
Here is a simple example from [ColorTap.cs](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Demo_BasicSetup/Scripts/ColorTap.cs) to change a hologram's color when it is being looked at.
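The snippet itself is not reproduced on this page; a minimal focus handler along these lines could look as follows (the class name, field names, and colors are illustrative and not taken from ColorTap.cs):

```csharp
// Minimal sketch of a focus handler (illustrative, not the actual ColorTap.cs).
// Attach to a GameObject with a Renderer and a Collider.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class ColorOnFocus : MonoBehaviour, IMixedRealityFocusHandler
{
    [SerializeField] private Color colorOnFocus = Color.yellow;
    [SerializeField] private Color colorDefault = Color.white;

    private Material material;

    private void Awake()
    {
        material = GetComponent<Renderer>().material;
    }

    // Called when any pointer (head gaze, eye gaze, hand ray, ...) starts focusing this object.
    public void OnFocusEnter(FocusEventData eventData)
    {
        material.color = colorOnFocus;
    }

    // Called when that pointer stops focusing this object.
    public void OnFocusExit(FocusEventData eventData)
    {
        material.color = colorDefault;
    }
}
```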
#### Selecting a Focused Hologram
To select focused holograms, we can use Input Event Listeners to confirm a selection.
For example, you can add the *IMixedRealityPointerHandler* to react to simple pointer input.
The *IMixedRealityPointerHandler* interface requires you to implement the following three interface members: *OnPointerUp*, *OnPointerDown*, and *OnPointerClicked*.
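As a rough sketch (the class name is illustrative, and depending on your MRTK version the interface may additionally declare *OnPointerDragged*), such a handler could look like this; it also shows where the associated input action surfaces:

```csharp
// Minimal sketch of a pointer handler that reacts to a "click" (e.g., Look & Pinch).
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SelectOnClick : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerDown(MixedRealityPointerEventData eventData) { }

    public void OnPointerUp(MixedRealityPointerEventData eventData) { }

    // Required in MRTK versions where the interface also declares dragging.
    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }

    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        // The input action tells you which configured action triggered the click.
        Debug.Log($"'{gameObject.name}' was selected via action " +
                  $"'{eventData.MixedRealityInputAction.Description}'.");
    }
}
```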
The *MixedRealityInputAction* is a configurable list of actions that you want to distinguish in your app and can be edited in the *'Input Actions Profile'* of your *'Input System Profile'*.
### Use Eye-Gaze-Specific *BaseEyeFocusHandler*
Given that eye gaze can be very different from other pointer inputs, you may want to make sure to react to focus only if it is eye gaze.
Similar to the *FocusHandler*, the *BaseEyeFocusHandler* is specific to Eye Tracking.
Here is an example from [mrtk_eyes_02_TargetSelection.unity](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Scenes/mrtk_eyes_02_TargetSelection.unity).
Having the *OnLookAt_Rotate.cs* script attached, a GameObject will rotate while being looked at.
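The script itself is not reproduced on this page; a minimal sketch of the same idea based on *BaseEyeFocusHandler* could look as follows (the class name and rotation speed are illustrative):

```csharp
// Minimal sketch (illustrative, not the actual OnLookAt_Rotate.cs):
// rotate the GameObject around its up axis for as long as eye gaze stays on it.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class RotateWhileLookedAt : BaseEyeFocusHandler
{
    [SerializeField] private float degreesPerSecond = 30f;

    // Called every frame while the eye gaze ray keeps hitting this target's collider.
    protected override void OnEyeFocusStay()
    {
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }
}
```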
The *BaseEyeFocusHandler* provides more than only *OnEyeFocusStay*. Here is an overview of the other events:
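(The listing below is a sketch of the overridable events as exposed by *BaseEyeFocusHandler* in MRTK v2; the XML summaries are paraphrased rather than quoted from the source.)

```csharp
/// <summary>
/// Triggered once the eye gaze ray starts intersecting with this target's collider.
/// </summary>
protected virtual void OnEyeFocusStart() { }

/// <summary>
/// Triggered while the eye gaze ray is intersecting with this target's collider.
/// </summary>
protected virtual void OnEyeFocusStay() { }

/// <summary>
/// Triggered once the eye gaze ray stops intersecting with this target's collider.
/// </summary>
protected virtual void OnEyeFocusStop() { }

/// <summary>
/// Triggered once the eye gaze ray has intersected with this target's collider
/// for a specified amount of time.
/// </summary>
protected virtual void OnEyeFocusDwell() { }
```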
#### Example: Attentive Notifications
For example, in [mrtk_eyes_02_TargetSelection.unity](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Scenes/mrtk_eyes_02_TargetSelection.unity),
you can find an example of *'smart attentive notifications'* that react to your eye gaze.
These are 3D text boxes that can be placed in the scene and that smoothly enlarge and turn toward the user when being looked at, to ease legibility.
While the user is reading the notification, the information keeps being displayed crisp and clear.
After reading it and looking away, the notification is automatically dismissed and fades out.
The advantage of this approach is that the same scripts can be reused by various events.
For example, a hologram may start facing the user based on a voice command or after pressing a virtual button.
To trigger these events, you can simply reference the methods that should be executed in the [EyeTrackingTarget](/Assets/MixedRealityToolkit.SDK/Features/Input/Handlers/EyeTrackingTarget.cs)
script that is attached to your GameObject.

For the example of the *'smart attentive notifications'*, the following happens (a code sketch follows the list below):
- **OnLookAtStart()**: The notification starts to...
  - *FaceUser.Engage:* ... turn toward the user.
  - *ChangeSize.Engage:* ... increase in size *(up to a specified maximum scale)*.
  - *BlendOut.Engage:* ... blend in more *(after being at a more subtle idle state)*.

- **OnDwell()**: Informs the *BlendOut* script that the notification has been looked at sufficiently.

- **OnLookAway()**: The notification starts to...
  - *FaceUser.Disengage:* ... turn back to its original orientation.
  - *ChangeSize.Disengage:* ... decrease back to its original size.
  - *BlendOut.Disengage:* ... blend out. If *OnDwell()* was triggered, blend out completely and destroy the notification; otherwise return to its idle state.
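These methods are typically wired up in the Inspector; the equivalent wiring in code could look roughly like this (assuming the *OnLookAtStart*, *OnDwell*, and *OnLookAway* events on *EyeTrackingTarget* are UnityEvents, and with placeholder listeners instead of the actual *FaceUser*/*ChangeSize*/*BlendOut* scripts):

```csharp
// Minimal sketch: subscribing to EyeTrackingTarget events from code instead of the Inspector.
// The listener bodies are placeholders for calls such as FaceUser.Engage or BlendOut.Disengage.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

[RequireComponent(typeof(EyeTrackingTarget))]
public class HookUpNotificationEvents : MonoBehaviour
{
    private void Awake()
    {
        var eyeTarget = GetComponent<EyeTrackingTarget>();

        eyeTarget.OnLookAtStart.AddListener(() => Debug.Log("Start turning, enlarging and blending in."));
        eyeTarget.OnDwell.AddListener(() => Debug.Log("Notification has been looked at long enough."));
        eyeTarget.OnLookAway.AddListener(() => Debug.Log("Revert to idle state or blend out completely."));
    }
}
```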
**Design Consideration:**
The key to an enjoyable experience here is to carefully tune the speed of all of these behaviors to avoid causing discomfort by reacting to the user's eye gaze too quickly all the time. Otherwise, this can quickly feel extremely overwhelming.
The interesting part is *how* the selection is triggered.
The [EyeTrackingTarget](/Assets/MixedRealityToolkit.SDK/Features/Input/Handlers/EyeTrackingTarget.cs)
allows for quickly assigning different ways to invoke a selection:
- *Pinch gesture*: Setting the 'Select Action' to 'Select' uses the default hand gesture to trigger the selection. This means that the user can simply raise their hand and pinch their thumb and index finger together to confirm the selection.

- Say *"Select"*: Use the default voice command *"Select"* for selecting a hologram.

- Say *"Explode"* or *"Pop"*: To use custom voice commands, you need to follow two steps:
  1. Set up a custom action such as *"DestroyTarget"*
     - Navigate to *'Input System Profile'* -> *'Input Actions Profile'*
     - Add a new action
  2. Set up the voice commands that trigger this action, such as *"Explode"* or *"Pop"*
     - Navigate to *'Input System Profile'* -> *'Speech Commands Profile'*
     - Add a new speech command and associate the action you just created
     - Assign a *'KeyCode'* to allow for triggering the action via a button press
<br>
This should get you started in accessing Eye Tracking data in your MRTK Unity app!
---
[Back to "Eye Tracking in the MixedRealityToolkit"](/Documentation/EyeTracking/EyeTracking_Main.md)