
Commit c5a7bab

Added input pages to documentation TOC. Reworked introduction.
1 parent 1d3a79a

File tree: 8 files changed (+36, -43 lines)

Documentation/Input/Controllers.md

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 # Controllers

-Each controller type has a number of *physical inputs*. Each of these is defined by an *Axis Type*, telling us the data type of the input value (Digital, Single Axis, Dual Axis, Six Dof, ...), and an *Input Type* (Button Press, Trigger, Thumb Stick, Spatial Pointer, ...) describing the origin of the input. Physical inputs are mapped to *input actions* via *interaction mappings* in *Controller Input Mapping Profile*.
+Controllers are created and destroyed automatically by [**input providers**](InputProviders.md). Each controller type has a number of *physical inputs* defined by an *axis type*, telling us the data type of the input value (Digital, Single Axis, Dual Axis, Six Dof, ...), and an *input type* (Button Press, Trigger, Thumb Stick, Spatial Pointer, ...) describing the origin of the input. Physical inputs are mapped to *input actions* in the **Controller Input Mapping Profile**, under the *Input System Profile* in the Mixed Reality Toolkit component.

 <img src="../../External/ReadMeImages/Input/ControllerInputMapping.png" style="max-width:100%;">
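
The interaction mappings mentioned above can also be inspected from code. Below is a minimal, hedged sketch: it assumes the input system exposes detected controllers through `MixedRealityToolkit.InputSystem.DetectedControllers` and that each controller lists its physical inputs in an `Interactions` array (member names may differ between MRTK versions); the class name `LogControllerMappings` is illustrative.

```c#
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Logs every physical input (interaction mapping) of the detected controllers,
// together with its axis type, input type and mapped input action.
public class LogControllerMappings : MonoBehaviour
{
    [ContextMenu("Log Mappings")]
    private void LogMappings()
    {
        foreach (IMixedRealityController controller in MixedRealityToolkit.InputSystem.DetectedControllers)
        {
            Debug.Log($"{controller.GetType().Name} ({controller.ControllerHandedness})");

            foreach (MixedRealityInteractionMapping mapping in controller.Interactions)
            {
                Debug.Log($"  {mapping.Description}: AxisType={mapping.AxisType}, " +
                          $"InputType={mapping.InputType}, Action={mapping.MixedRealityInputAction.Description}");
            }
        }
    }
}
```
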
Documentation/Input/InputActions.md

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 # Input Actions

-Input actions can be configured in the *Input Actions Profile* specifying a name for the action and the *Axis Type* of the physical inputs it can be mapped to. The input action mapped to a physical input, if any, is stored in *input events*.
+Input actions can be configured in the **Input Actions Profile**, inside the *Input System Profile* in the Mixed Reality Toolkit component, specifying a name for the action and the *Axis Type* of the physical inputs it can be mapped to. The input action mapped to a physical input, if any, is included in *input events*.

 <img src="../../External/ReadMeImages/Input/InputActions.png" style="max-width:100%;">
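
Since the reworked text notes that the mapped input action is included in input events, here is a minimal, hedged sketch of a script reacting to a specific action instead of a raw input: it assumes the action (e.g. *Select*) is assigned to the serialized field in the inspector and that input event data exposes the action via `MixedRealityInputAction`; the class name `SelectOnAction` is illustrative.

```c#
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Reacts to a configured input action rather than to a specific button or trigger.
public class SelectOnAction : MonoBehaviour, IMixedRealityInputHandler
{
    // Assign an action defined in the Input Actions Profile (e.g. "Select").
    [SerializeField]
    private MixedRealityInputAction selectAction = MixedRealityInputAction.None;

    public void OnInputDown(InputEventData eventData)
    {
        if (eventData.MixedRealityInputAction == selectAction)
        {
            Debug.Log($"{name}: select action started");
        }
    }

    public void OnInputUp(InputEventData eventData)
    {
        if (eventData.MixedRealityInputAction == selectAction)
        {
            Debug.Log($"{name}: select action ended");
        }
    }
}
```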

Documentation/Input/InputEvents.md

Lines changed: 2 additions & 0 deletions
@@ -12,5 +12,7 @@ Handler | Events | Description
 [`IMixedRealityDictationHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityDictationHandler) | Dictation Hypothesis / Result / Complete / Error | Raised by dictation systems to report the results of a dictation session.
 [`IMixedRealityGestureHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityGestureHandler) | Gesture Started / Updated / Completed / Canceled | Raised on gesture detection.
 [`IMixedRealityGestureHandler<T>`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityGestureHandler`1) | Gesture Updated / Completed | Raised on detection of gestures containing additional data of the given type. See the [**Gestures**](Gestures.md) page for details on possible values for **T**.
+[`IMixedRealityHandJointHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHandJointHandler) | Hand Joints Updated | Raised by articulated hand controllers when hand joints are updated.
+[`IMixedRealityHandMeshHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHandMeshHandler) | Hand Mesh Updated | Raised by articulated hand controllers when a hand mesh is updated.

 By default a script will receive events only while in focus by a pointer. To receive events while out of focus, register the script's game object as a global listener via [`MixedRealityToolkit.InputSystem.Register`](xref:Microsoft.MixedReality.Toolkit.IMixedRealityEventSystem) or derive the script from [`InputSystemGlobalListener`](xref:Microsoft.MixedReality.Toolkit.Input.InputSystemGlobalListener).
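
The global-listener note above can be illustrated with a small, hedged sketch: a speech command handler that registers itself with the input system so it keeps receiving events without pointer focus. It assumes an `Unregister` counterpart to the `Register` call mentioned in the text; the class name `GlobalSpeechListener` is illustrative.

```c#
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Listens for speech keywords globally, i.e. even when this game object
// is not focused by any pointer.
public class GlobalSpeechListener : MonoBehaviour, IMixedRealitySpeechHandler
{
    private void OnEnable()
    {
        // Register as a global listener so input events reach this object without focus.
        MixedRealityToolkit.InputSystem.Register(gameObject);
    }

    private void OnDisable()
    {
        MixedRealityToolkit.InputSystem.Unregister(gameObject);
    }

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        Debug.Log($"Recognized keyword: {eventData.Command.Keyword}");
    }
}
```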

Documentation/Input/InputProviders.md

Lines changed: 6 additions & 4 deletions
@@ -1,6 +1,10 @@
 # Input Providers

-We currently support the following input providers with their corresponding controllers:
+Input providers are registered in the **Registered Service Providers Profile**, found in the Mixed Reality Toolkit component:
+
+<img src="../../External/ReadMeImages/Input/RegisteredServiceProviders.png" style="max-width:100%;">
+
+These are the input providers available out of the box, together with their corresponding controllers:

 Input Provider | Controllers
 --- | ---
@@ -15,6 +19,4 @@ Windows Speech Input Provider | *

 \* Dictation and Speech providers don't create any controllers, they raise their own specialized input events directly.

-<img src="../../External/ReadMeImages/Input/RegisteredServiceProviders.png" style="max-width:100%;">
-
-<sup>Registered Service Providers Profile. Found in the Mixed Reality Toolkit component, is the place to go to add or remove input providers.</sup>
+Custom input providers can be created by implementing the [`IMixedRealityDeviceManager`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityDeviceManager) interface.
Documentation/Input/Overview.md

Lines changed: 4 additions & 10 deletions
@@ -1,24 +1,18 @@
-# Introduction
+# Input Overview

 The Input System in MRTK allows you to:
 - Consume inputs from a variety of input sources, like 6 DOF controllers, articulated hands or speech, via input events.
 - Define abstract actions, like *Select* or *Menu*, and associate them to different inputs.
 - Setup pointers attached to controllers to drive UI components via focus and pointer events.

-In this section we will introduce the main MRTK concepts you need to understand to make use of this functionality.
-
 Inputs are produced by [**Input Providers**](InputProviders.md). Each provider corresponds to a particular source of input: Open VR, Windows Mixed Reality (WMR), Unity Joystick, Windows Speech, etc. Providers are added to your project via the **Registered Service Providers Profile** in the *Mixed Reality Toolkit* component and will produce [**Input Events**](InputEvents.md) automatically when the corresponding input sources are available, e.g. when a WMR controller is detected or a gamepad connected.

-[**Input Actions**](InputActions.md) are abstractions over raw inputs meant to help isolating application logic from the specific input sources producing an input. For example, it can be useful to define a *Select* action and map it to the left mouse button, a button in a gamepad and a trigger in a 6 DOF controller. You can then have your application logic listen for input events mapped to that action instead of having to be aware of all the different inputs that can produce it. Input Actions are defined in the **Input Actions Profile**, found within the *Input System Profile* in the *Mixed Reality Toolkit* component.
+[**Input Actions**](InputActions.md) are abstractions over raw inputs meant to help isolate application logic from the specific input sources producing an input. It can be useful, for example, to define a *Select* action and map it to the left mouse button, a button in a gamepad and a trigger in a 6 DOF controller. You can then have your application logic listen for input events mapped to that action instead of having to be aware of all the different input sources that can produce it. Input Actions are defined in the **Input Actions Profile**, found within the *Input System Profile* in the *Mixed Reality Toolkit* component.

-[**Controllers**](Controllers.md) are created by *input providers* as input devices are detected and destroyed when they're lost or disconnected. The WMR input provider, for example, will create *WMR controllers* for 6 DOF devices and *WMR articulated hand controllers* for articulated hands. Controller inputs can be mapped to input actions via the **Controller Mapping Profile**, inside the *Input System Profile*. Inputs events raised by the controller will include the associated input action, if any.
+[**Controllers**](Controllers.md) are created by *input providers* when input devices are detected and destroyed when they're lost or disconnected. The WMR input provider, for example, will create *WMR controllers* for 6 DOF devices and *WMR articulated hand controllers* for articulated hands. Controller inputs can be mapped to input actions via the **Controller Mapping Profile**, inside the *Input System Profile*. Input events raised by controllers will include the associated input action, if any.

 Controllers can have [**Pointers**](Pointers.md) attached to them that query the scene to determine the game object with focus and raise [**Pointer Events**](Pointers.md#pointer-events) on it. As an example, our *line pointer* performs a raycast against the scene using the controller pose to compute the origin and direction of the ray. The pointers created for each controller are set up in the **Pointer Profile**, under the *Input System Profile*.

 <img src="../../External/ReadMeImages/Input/EventFlow.png" style="display:block;margin-left:auto;margin-right:auto;">

-<sup>Event flow. While you can handle input events directly in UI components it is recommended to use just pointer events to keep the logic device-independent.</sup>
-
-Additional forms of input are available through [**Gestures**](Gestures.md), [**Speech**](Speech.md) and [**Dictation**](Dictation.md).
-
-**Advanced Topics** : Controller Visualization, Gaze Provider, Focus Provider, Action Rules, Input System
+<sup>Event flow. While you can handle input events directly in UI components, it is recommended to use pointer events to keep the implementation device-independent.</sup>
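
To make the focus flow described above concrete, here is a minimal, hedged sketch of a component reacting to focus changes raised by pointers, using the `IMixedRealityFocusHandler` interface listed on the Pointers page; the highlight logic and the class name `FocusHighlight` are illustrative.

```c#
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Tints the attached object while any pointer gives it focus.
public class FocusHighlight : MonoBehaviour, IMixedRealityFocusHandler
{
    [SerializeField]
    private Color highlightColor = Color.cyan;

    private Renderer targetRenderer;
    private Color originalColor;

    private void Awake()
    {
        targetRenderer = GetComponent<Renderer>();
        originalColor = targetRenderer.material.color;
    }

    // Called when the first pointer enters the object.
    public void OnFocusEnter(FocusEventData eventData)
    {
        targetRenderer.material.color = highlightColor;
    }

    // Called when the last pointer leaves the object.
    public void OnFocusExit(FocusEventData eventData)
    {
        targetRenderer.material.color = originalColor;
    }
}
```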

Documentation/Input/Pointers.md

Lines changed: 5 additions & 4 deletions
@@ -1,11 +1,11 @@
 # Pointers

-Pointers are instanced automatically at runtime when a new controller is detected. You can have more than one pointer attached to a controller; for example, with the default pointer profile, WMR controllers get both a line and a parabolic pointer for normal selection and teleportation respectively. Pointers communicate with each other to decide which one is active.
-
-MRTK provides a set of pointer prefabs in *Assets/MixedRealityToolkit.SDK/Features/UX/Prefabs/Pointers*. You can use your own prefabs as long as they contain one of the pointer scripts in *Assets/MixedRealityToolkit.SDK/Features/UX/Scripts/Pointers* or any other script implementing [`IMixedRealityPointer`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityPointer).
+Pointers are instanced automatically at runtime when a new controller is detected. You can have more than one pointer attached to a controller; for example, with the default pointer profile, WMR controllers get both a line and a parabolic pointer for normal selection and teleportation respectively. Pointers communicate with each other to decide which one is active. The pointers created for each controller are set up in the **Pointer Profile**, under the *Input System Profile*.

 <img src="../../External/ReadMeImages/Input/PointerProfile.png" style="max-width:100%;">

+MRTK provides a set of pointer prefabs in *Assets/MixedRealityToolkit.SDK/Features/UX/Prefabs/Pointers*. You can use your own prefabs as long as they contain one of the pointer scripts in *Assets/MixedRealityToolkit.SDK/Features/UX/Scripts/Pointers* or any other script implementing [`IMixedRealityPointer`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityPointer).
+
 ## Pointer Events

 To receive pointer events, implement one of the following interfaces in your script:
@@ -14,4 +14,5 @@ Handler | Events | Description
 --- | ---
 [`IMixedRealityFocusChangedHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityFocusChangedHandler) | Before Focus Changed / Focus Changed | Raised on both the game object losing focus and the one gaining it every time a pointer changes focus.
 [`IMixedRealityFocusHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityFocusHandler) | Focus Enter / Exit | Raised on the game object gaining focus when the first pointer enters it and on the one losing focus when the last pointer leaves it.
-[`IMixedRealityPointerHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityPointerHandler) | Pointer Down / Up / Clicked | Raised to report pointer input.
+[`IMixedRealityPointerHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityPointerHandler) | Pointer Down / Up / Clicked | Raised to report pointer input.
+[`IMixedRealityTouchHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityTouchHandler) | Touch Started / Updated / Completed | Raised by touch-aware pointers like the [**Poke Pointer**](xref:Microsoft.MixedReality.Toolkit.Input.PokePointer) to report touch activity.
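
As a rough illustration of the table above, a minimal sketch of a script receiving pointer events on the focused object; it implements the three events listed for `IMixedRealityPointerHandler` (later MRTK versions may add members to this interface), and the class name `PointerEventLogger` is illustrative.

```c#
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Logs pointer activity on the game object this script is attached to.
// Events arrive only while the object has pointer focus.
public class PointerEventLogger : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerDown(MixedRealityPointerEventData eventData)
    {
        Debug.Log($"{name}: pointer down from {eventData.Pointer.PointerName}");
    }

    public void OnPointerUp(MixedRealityPointerEventData eventData)
    {
        Debug.Log($"{name}: pointer up");
    }

    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        Debug.Log($"{name}: pointer clicked");
    }
}
```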

Documentation/Input/toc.yml

Lines changed: 0 additions & 18 deletions
This file was deleted.

Documentation/toc.yml

Lines changed: 17 additions & 5 deletions
@@ -33,18 +33,30 @@
 - name: Service Provider
   href: MixedRealityConfigurationGuide.md
 - name: Input
-  href: TODO.md
+  href: Input/Overview.md
   items:
+  - name: Input Providers
+    href: Input/InputProviders.md
+  - name: Input Events
+    href: Input/InputEvents.md
+  - name: Input Actions
+    href: Input/InputActions.md
+  - name: Controllers
+    href: Input/Controllers.md
+  - name: Pointers
+    href: Input/Pointers.md
+  - name: Gestures
+    href: Input/Gestures.md
+  - name: Speech
+    href: Input/Speech.md
+  - name: Dictation
+    href: Input/Dictation.md
 - name: Hands
   href: TODO.md
-- name: Voice
-  href: TODO.md
 - name: Eye
   href: TODO.md
 - name: Gaze
   href: TODO.md
-- name: Controllers
-  href: TODO.md
 - name: Recommended Interaction Models
   href: TODO.md
   items:
