# Input Overview

The Input System in MRTK allows you to:
- Consume inputs from a variety of input sources, such as 6 DOF controllers, articulated hands or speech, via input events.
- Define abstract actions, like *Select* or *Menu*, and associate them with different inputs.
- Set up pointers attached to controllers to drive UI components via focus and pointer events.

Inputs are produced by [**Input Providers**](InputProviders.md). Each provider corresponds to a particular source of input: OpenVR, Windows Mixed Reality (WMR), Unity Joystick, Windows Speech, etc. Providers are added to your project via the **Registered Service Providers Profile** in the *Mixed Reality Toolkit* component and will produce [**Input Events**](InputEvents.md) automatically when the corresponding input sources are available, e.g. when a WMR controller is detected or a gamepad is connected.

[**Input Actions**](InputActions.md) are abstractions over raw inputs meant to help isolate application logic from the specific input sources producing an input. It can be useful, for example, to define a *Select* action and map it to the left mouse button, a button in a gamepad and a trigger in a 6 DOF controller. You can then have your application logic listen for input events mapped to that action instead of having to be aware of all the different input sources that can produce it. Input Actions are defined in the **Input Actions Profile**, found within the *Input System Profile* in the *Mixed Reality Toolkit* component.
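
As an illustrative sketch of listening for an action instead of a raw input (interface and namespace names are taken from the MRTK v2 API and may differ between versions; the serialized `selectAction` field is an assumption, meant to be set in the inspector to an action defined in the Input Actions Profile):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Sketch only, not verbatim MRTK sample code. Reacts to any input mapped
// to the chosen action, regardless of which device produced it. The object
// must have focus, or the handler must be registered as a global listener.
public class SelectActionListener : MonoBehaviour, IMixedRealityInputHandler
{
    // Assign in the inspector to an action from the Input Actions Profile.
    [SerializeField]
    private MixedRealityInputAction selectAction = MixedRealityInputAction.None;

    public void OnInputDown(InputEventData eventData)
    {
        // Compare against the abstract action, not the raw input source.
        if (eventData.MixedRealityInputAction == selectAction)
        {
            Debug.Log("Select action triggered");
        }
    }

    public void OnInputUp(InputEventData eventData) { }
}
```

Because the comparison is against the action, remapping *Select* to a different button or device in the profile requires no code changes.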

[**Controllers**](Controllers.md) are created by *input providers* when input devices are detected and destroyed when they're lost or disconnected. The WMR input provider, for example, will create *WMR controllers* for 6 DOF devices and *WMR articulated hand controllers* for articulated hands. Controller inputs can be mapped to input actions via the **Controller Mapping Profile**, inside the *Input System Profile*. Input events raised by controllers will include the associated input action, if any.
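
The controller lifecycle can be observed through source state events. A hedged sketch (API names as in MRTK v2; how the handler gets registered with the input system varies by version):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Sketch only: logs when input sources (e.g. controllers) appear and
// disappear. Source state events are not tied to a focused object, so
// this handler is typically registered as a global listener.
public class ControllerWatcher : MonoBehaviour, IMixedRealitySourceStateHandler
{
    public void OnSourceDetected(SourceStateEventData eventData)
    {
        Debug.Log($"Source detected: {eventData.InputSource.SourceName}");
    }

    public void OnSourceLost(SourceStateEventData eventData)
    {
        Debug.Log($"Source lost: {eventData.InputSource.SourceName}");
    }
}
```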

Controllers can have [**Pointers**](Pointers.md) attached to them that query the scene to determine the game object with focus and raise [**Pointer Events**](Pointers.md#pointer-events) on it. As an example, our *line pointer* performs a raycast against the scene using the controller pose to compute the origin and direction of the ray. The pointers created for each controller are set up in the **Pointer Profile**, under the *Input System Profile*.
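
A component can react to pointer events on the object it is attached to by implementing MRTK's pointer handler interface. A sketch, assuming the MRTK v2 API (names may vary across versions):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Sketch only: receives pointer events when this object has focus,
// e.g. when a line pointer's ray hits this object's collider.
public class PointerClickResponder : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        Debug.Log("Clicked by pointer: " + eventData.Pointer.PointerName);
    }

    public void OnPointerDown(MixedRealityPointerEventData eventData) { }
    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }
    public void OnPointerUp(MixedRealityPointerEventData eventData) { }
}
```

Since pointer events are raised on the focused object regardless of which controller drives the pointer, the same component works for rays, touch or mouse without modification.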

<img src="../../External/ReadMeImages/Input/EventFlow.png" style="display:block;margin-left:auto;margin-right:auto;">

<sup>Event flow. While you can handle input events directly in UI components, it is recommended to use pointer events to keep the implementation device-independent.</sup>

Additional forms of input are available through [**Gestures**](Gestures.md), [**Speech**](Speech.md) and [**Dictation**](Dictation.md).