Commit c9ac2c7 (parent b879621): Update EyeTracking_TargetSelection.md
Updating main picture. Additional minor formatting issues.

File tree

1 file changed

+32
-42
lines changed

1 file changed

+32
-42
lines changed

Documentation/EyeTracking/EyeTracking_TargetSelection.md

Lines changed: 32 additions & 42 deletions
Original file line numberDiff line numberDiff line change
![MRTK](/External/ReadMeImages/EyeTracking/mrtk_et_targetselect.png)

# Eye-Supported Target Selection

This page discusses different options for accessing eye gaze data and eye-gaze-specific events to select targets in MRTK.

Eye Tracking allows for fast and effortless target selection by combining information about what a user is looking at with additional inputs such as
_hand tracking_ and _voice commands_:
- Look & Pinch (i.e., hold up your hand in front of you and pinch your thumb and index finger together)
- Look & Say _"Select"_ (default voice command)
- Look & Say _"Explode"_ or _"Pop"_ (custom voice commands)
- Look & Bluetooth button
## Target Selection

The way users select holograms using their eyes is the same as for any other focus input (e.g., head gaze).
This provides the great advantage of a flexible way to interact with your holograms by defining the main focus type in your MRTK Input Pointer Profile depending
on your user's needs, while leaving your code untouched.
For example, this would enable you to switch between head and eye gaze without changing a line of code.
To detect when a hologram is focused, use the _IMixedRealityFocusHandler_ interface, which provides two interface members: _OnFocusEnter_ and
_OnFocusExit_.
Here is a simple example from [ColorTap.cs](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Demo_BasicSetup/Scripts/ColorTap.cs) to change a hologram's
color when being looked at.
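
A minimal sketch of the focus-handling part (an illustrative approximation, not the verbatim ColorTap.cs; the field names and color choices are assumptions):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Illustrative sketch: highlight a hologram while it is focused.
public class ColorTap : MonoBehaviour, IMixedRealityFocusHandler
{
    [SerializeField] private Color colorOnFocus = Color.yellow; // assumed default
    private Color originalColor;
    private Material material;

    private void Awake()
    {
        material = GetComponent<Renderer>().material;
        originalColor = material.color;
    }

    public void OnFocusEnter(FocusEventData eventData)
    {
        // Focus (e.g., eye gaze) started hitting this hologram.
        material.color = colorOnFocus;
    }

    public void OnFocusExit(FocusEventData eventData)
    {
        // Focus left the hologram; restore its original color.
        material.color = originalColor;
    }
}
```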
#### Selecting a Focused Hologram
To select focused holograms, we can use Input Event Listeners to confirm a selection.
For example, you can add the _IMixedRealityPointerHandler_ to react to simple pointer input.
The _IMixedRealityPointerHandler_ interface requires you to implement the following three interface members:
_OnPointerUp_, _OnPointerDown_, and _OnPointerClicked_.

The _MixedRealityInputAction_ is a configurable list of actions that you want to distinguish in your app and can be edited in the
_MRTK Configuration Profile_ -> _Input System Profile_ -> _Input Actions Profile_.
```csharp
public class ColorTap : MonoBehaviour, IMixedRealityFocusHandler, IMixedRealityPointerHandler
{
    // ... (focus and pointer handler members; a sketch of the pointer part follows below)
}
```
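
The class body is truncated above; here is a hedged sketch of the pointer-handling part (assuming the three interface members named earlier; the class name and method bodies are illustrative, not the repo's exact code):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Illustrative sketch: react to pointer input on a focused hologram.
public class TapToRecolor : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerDown(MixedRealityPointerEventData eventData) { }

    public void OnPointerUp(MixedRealityPointerEventData eventData) { }

    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        // The event carries the MixedRealityInputAction that triggered it,
        // so we can react only to the action we care about (e.g., "Select").
        if (eventData.MixedRealityInputAction.Description == "Select")
        {
            GetComponent<Renderer>().material.color = Random.ColorHSV();
        }
    }
}
```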


### Use Eye-Gaze-Specific _BaseEyeFocusHandler_

Given that eye gaze can be very different from other pointer inputs, you may want to make sure to react to the focus only if it is eye gaze.
Similar to the _FocusHandler_, the _BaseEyeFocusHandler_ is specific to Eye Tracking.
Here is an example from [mrtk_eyes_02_TargetSelection.unity](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Scenes/mrtk_eyes_02_TargetSelection.unity).
Having the [OnLookAt_Rotate.cs]() attached, a GameObject will rotate while being looked at.

```csharp
// Reconstructed sketch (the rotation details are assumptions; see the actual
// OnLookAt_Rotate.cs in the repo): rotate while the eye gaze ray stays on target.
public class OnLookAt_Rotate : BaseEyeFocusHandler
{
    protected override void OnEyeFocusStay()
    {
        transform.Rotate(Vector3.up * Time.deltaTime * 20f);
    }
}
```

The _BaseEyeFocusHandler_ provides more than just _OnEyeFocusStay_. Here is an overview of the other events:

```csharp
/// <summary>
/// Triggered once the eye gaze ray *starts* intersecting with this target's collider.
/// </summary>
protected virtual void OnEyeFocusStart() { }

/// <summary>
/// Triggered *while* the eye gaze ray is intersecting with this target's collider.
/// </summary>
protected virtual void OnEyeFocusStay() { }

/// <summary>
/// Triggered once the eye gaze ray *stops* intersecting with this target's collider.
/// </summary>
protected virtual void OnEyeFocusStop() { }

/// <summary>
/// Triggered once the eye gaze ray has been intersecting with this target's collider
/// for a specified amount of time.
/// </summary>
protected virtual void OnEyeFocusDwell() { }
```

#### Example: Attentive Notifications
For example, in [mrtk_eyes_02_TargetSelection.unity](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Scenes/mrtk_eyes_02_TargetSelection.unity),
you can find an example of _'smart attentive notifications'_ that react to your eye gaze.
These are 3D text boxes that can be placed in the scene and that will smoothly enlarge and turn toward the user when being looked at to ease legibility.
While the user is reading the notification, the information keeps being displayed crisp and clear.
After the user reads it and looks away, the notification is automatically dismissed and fades out.

The advantage of this approach is that the same scripts can be reused by various events.
For example, a hologram may start facing the user based on a voice command or after a virtual button is pressed.
To trigger these events, you can simply reference the methods that should be executed in the [EyeTrackingTarget](/Assets/MixedRealityToolkit.SDK/Features/Input/Handlers/EyeTrackingTarget.cs)
script that is attached to your GameObject.
For the example of the _'smart attentive notifications'_, the following happens (a wiring sketch follows the list):

- **OnLookAtStart()**: The notification starts to...
  - _FaceUser.Engage:_ ... turn toward the user.
  - _ChangeSize.Engage:_ ... increase in size _(up to a specified maximum scale)_.
  - _BlendOut.Engage:_ ... blend in more _(after being at a more subtle idle state)_.

- **OnDwell()**: Informs the _BlendOut_ script that the notification has been looked at sufficiently.

- **OnLookAway()**: The notification starts to...
  - _FaceUser.Disengage:_ ... turn back to its original orientation.
  - _ChangeSize.Disengage:_ ... decrease back to its original size.
  - _BlendOut.Disengage:_ ... blend out. If _OnDwell()_ was triggered, blend out completely and destroy the notification; otherwise, return to its idle state.
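
These hookups are normally made in the Unity Inspector. As a hedged sketch, the same wiring could be done in code, assuming the events are exposed as UnityEvents and that the _FaceUser_, _ChangeSize_, and _BlendOut_ helpers provide the _Engage()_/_Disengage()_ methods described above:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical wiring sketch for the attentive-notification behaviors.
public class NotificationWiring : MonoBehaviour
{
    private void Awake()
    {
        var eyeTarget = GetComponent<EyeTrackingTarget>();
        var faceUser = GetComponent<FaceUser>();
        var changeSize = GetComponent<ChangeSize>();
        var blendOut = GetComponent<BlendOut>();

        eyeTarget.OnLookAtStart.AddListener(() =>
        {
            faceUser.Engage();
            changeSize.Engage();
            blendOut.Engage();
        });

        // OnDwell() would similarly notify the BlendOut script that the
        // notification has been looked at sufficiently (method name not shown here).

        eyeTarget.OnLookAway.AddListener(() =>
        {
            faceUser.Disengage();
            changeSize.Disengage();
            blendOut.Disengage();
        });
    }
}
```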

**Design Consideration:**
The key to an enjoyable experience here is to carefully tune the speed of these behaviors so that the app does not cause discomfort by reacting to the user's eye gaze too quickly all the time. Otherwise this can quickly feel extremely overwhelming.

#### Example: Multimodal Gaze-Supported Target Selection
One event provided by the [EyeTrackingTarget](/Assets/MixedRealityToolkit.SDK/Features/Input/Handlers/EyeTrackingTarget.cs), yet not used by the
_'Attentive Notifications'_, is the _OnSelected()_ event.
Using the _EyeTrackingTarget_, you can specify what triggers the selection, which will invoke the _OnSelected()_ event.
For example, the screenshot below is from
[mrtk_eyes_02_TargetSelection.unity](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Scenes/mrtk_eyes_02_TargetSelection.unity).
It shows how the [EyeTrackingTarget](/Assets/MixedRealityToolkit.SDK/Features/Input/Handlers/EyeTrackingTarget.cs)
is set up for one of the gems that explodes when you select it.

![MRTK](/External/ReadMeImages/EyeTracking/mrtk_et_EyeTrackingTarget.jpg)

The _OnSelected()_ event triggers the method _TargetSelected_ in the
[HitBehavior_DestroyOnSelect](/Assets/MixedRealityToolkit.Examples/Demos/EyeTracking/Demo_TargetSelections/Scripts/HitBehavior_DestroyOnSelect.cs)
script attached to the gem GameObject.
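
A minimal sketch of what that handler could look like (the method body is an assumption; see the repo script for the actual behavior):

```csharp
using UnityEngine;

// Illustrative sketch: remove the gem once it has been selected.
public class HitBehavior_DestroyOnSelect : MonoBehaviour
{
    // Referenced by the EyeTrackingTarget's OnSelected() event.
    public void TargetSelected()
    {
        // A sound or particle effect could be played here before removal.
        Destroy(gameObject);
    }
}
```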

The interesting part is _how_ the selection is triggered.
The [EyeTrackingTarget](/Assets/MixedRealityToolkit.SDK/Features/Input/Handlers/EyeTrackingTarget.cs)
allows for quickly assigning different ways to invoke a selection:

- _Pinch gesture_: Setting the 'Select Action' to 'Select' uses the default hand gesture to trigger the selection. This means that the user can simply raise their hand and pinch their thumb and index finger together to confirm the selection.

- Say _"Select"_: Use the default voice command _"Select"_ for selecting a hologram.

- Say _"Explode"_ or _"Pop"_: To use custom voice commands, you need to follow two steps (a code-level listener is sketched after this list):
  1. Set up a custom action such as _"DestroyTarget"_:
     - Navigate to _'Input System Profile'_ -> _'Input Actions Profile'_
     - Add new action

  2. Set up the voice commands that trigger this action, such as _"Explode"_ or _"Pop"_:
     - Navigate to _'Input System Profile'_ -> _'Speech Commands Profile'_
     - Add new speech command and associate the action you just created
     - Assign a _'KeyCode'_ to allow for triggering the action via a button press
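
Beyond the profile setup, a component can also listen for recognized keywords directly. Here is a hedged sketch using MRTK's _IMixedRealitySpeechHandler_ (an alternative illustration, not part of the steps above; the keywords must still be registered in the Speech Commands Profile):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Illustrative sketch: react to the custom voice commands on this object.
public class ExplodeOnKeyword : MonoBehaviour, IMixedRealitySpeechHandler
{
    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        string keyword = eventData.Command.Keyword;
        if (keyword == "Explode" || keyword == "Pop")
        {
            // Destroy the gem, mirroring the "DestroyTarget" action.
            Destroy(gameObject);
        }
    }
}
```
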
<br>
This should get you started in accessing Eye Tracking data in your MRTK Unity app!
---
[Back to "Eye Tracking in the MixedRealityToolkit"](/Documentation/EyeTracking/EyeTracking_Main.md)