Joints are visualized using simple prefabs. The _Palm_ and _Index Finger_ joints are of special importance and have their own prefab, while all other joints share the same prefab.
@@ -17,7 +17,7 @@ By default the hand joint prefabs are simple geometric primitives. These can be
The hand mesh is used if fully defined mesh data is provided by the hand tracking device. The mesh renderable in the prefab is replaced by data from the device, so a dummy mesh such as a cube is sufficient. The material of the prefab is used for the hand mesh.
@@ -47,11 +47,11 @@ Available joints are listed in the [`TrackedHandJoint`](xref:Microsoft.MixedReal
> [!NOTE]
> Joint objects are destroyed when hand tracking is lost! Make sure that any scripts using a joint object handle the `null` case gracefully to avoid errors!

-### Accessing a given Hand Controller
+### Accessing a given hand controller
A specific hand controller is often available, e.g. when handling input events. In this case the joint data can be requested directly from the device, using the [`IMixedRealityHand`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHand) interface.

-#### Polling Joint Pose from Controller
+#### Polling joint pose from controller
The [`TryGetJoint`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHand.TryGetJoint*) function returns `false` if the requested joint is not available for some reason. In that case the resulting pose will be [`MixedRealityPose.ZeroIdentity`](xref:Microsoft.MixedReality.Toolkit.Utilities.MixedRealityPose.ZeroIdentity).
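As a condensed sketch of that polling pattern (the handler context mirrors the `OnSourceDetected` example referenced in the hunk below; `TrackedHandJoint.IndexTip` is just an illustrative joint):

```c#
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class JointPollingSketch : MonoBehaviour
{
    // Called with a hand controller, e.g. from an input event handler.
    public void LogIndexTip(IMixedRealityHand hand)
    {
        if (hand.TryGetJoint(TrackedHandJoint.IndexTip, out MixedRealityPose pose))
        {
            Debug.Log($"Index tip at {pose.Position}");
        }
        // On failure, pose is MixedRealityPose.ZeroIdentity.
    }
}
```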
@@ -69,7 +69,7 @@ public void OnSourceDetected(SourceStateEventData eventData)
Documentation/Input/InputActions.md
+8 -8 lines changed
@@ -1,8 +1,8 @@
-# Input Actions
+# Input actions
[**Input Actions**](InputActions.md) are abstractions over raw inputs, meant to help isolate application logic from the specific input sources producing an input. It can be useful, for example, to define a *Select* action and map it to the left mouse button, a button on a gamepad, and a trigger on a 6 DOF controller. Your application logic can then listen for *Select* input action events instead of having to be aware of all the different inputs that can produce them.

-## Creating An Input Action
+## Creating an input action
Input actions are configured in the **Input Actions Profile**, inside the *Input System Profile* in the Mixed Reality Toolkit component, specifying a name for the action and the type of inputs (*Axis Constraint*) it can be mapped to:
@@ -19,11 +19,11 @@ Six Dof | 3D pose with translation and rotation like the one produced by 6 DOF c
You can find the full list in [`AxisType`](xref:Microsoft.MixedReality.Toolkit.Utilities.AxisType).

-## Mapping Inputs To Actions
+## Mapping input to actions
The way you map an input to an action depends on the type of the input source:

-### Controller Inputs
+### Controller input
Go to the **Controller Input Mapping Profile**, under the *Input System Profile*. There you will find a list of all supported controllers:
@@ -33,22 +33,22 @@ Select the one you want to configure and a dialog window will appear with all th
In the **Speech Command Profile**, under the *Input System Profile*, you'll find the list of currently defined speech commands. To map one of them to an action, just select it in the *Action* drop down.
The **Gestures Profile**, under the *Input System Profile*, contains all defined gestures. You can map each of them to an action by selecting it in the *Action* drop down.
> Currently only input actions of the *Digital* type can be handled using the methods described in this section. For other action types, you'll have to handle the events for the corresponding inputs directly instead. For example, to handle a 6 DOF action mapped to controller inputs, you'll have to use [`IMixedRealityGestureHandler<T>`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityGestureHandler`1) with T = [`MixedRealityPose`](xref:Microsoft.MixedReality.Toolkit.Utilities.MixedRealityPose).
The easiest way to handle input actions is to make use of the [`InputActionHandler`](xref:Microsoft.MixedReality.Toolkit.Input.InputActionHandler) script. This allows you to define the action you want to listen to and react to action started and ended events using Unity Events.
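For reference, a minimal sketch of the code-based alternative (assuming the global registration pattern via `IMixedRealityInputActionHandler`; the action field is assigned in the inspector):

```c#
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SelectActionListener : MonoBehaviour, IMixedRealityInputActionHandler
{
    [SerializeField]
    private MixedRealityInputAction selectAction = MixedRealityInputAction.None;

    private void OnEnable() =>
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityInputActionHandler>(this);

    private void OnDisable() =>
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityInputActionHandler>(this);

    public void OnActionStarted(BaseInputEventData eventData)
    {
        // Filter for the action configured in the Input Actions Profile.
        if (eventData.MixedRealityInputAction == selectAction)
        {
            Debug.Log("Select started");
        }
    }

    public void OnActionEnded(BaseInputEventData eventData)
    {
        if (eventData.MixedRealityInputAction == selectAction)
        {
            Debug.Log("Select ended");
        }
    }
}
```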
Documentation/Input/InputState.md
+2 -2 lines changed
@@ -1,4 +1,4 @@
-# Accessing Input State in MRTK
+# Accessing input state in MRTK
It's possible to directly query the state of all inputs in MRTK by iterating over the controllers attached to the input sources. MRTK also provides convenience methods for accessing the position and rotation of the eyes, hands, head, and motion controller.
@@ -49,7 +49,7 @@ foreach(var controller in CoreServices.InputSystem.DetectedControllers)
Documentation/Input/Overview.md
+2 -2 lines changed
@@ -1,4 +1,4 @@
-# Input Overview
+# Input overview
The Input System in MRTK allows you to:
@@ -22,4 +22,4 @@ Controllers can have [**Pointers**](Pointers.md) attached to them that query the
While you can handle [input events directly in UI components](InputEvents.md), it is recommended to use [pointer events](pointers.md#pointer-event-interfaces) to keep the implementation device-independent.

-MRTK also provides several convenience methods to query input state directly in a device-independent way. See [Accessing Input State in MRTK](InputState.md) for more details.
+MRTK also provides several convenience methods to query input state directly in a device-independent way. See [Accessing input state in MRTK](InputState.md) for more details.
The following table details the pointer types that are typically used for the common platforms in MRTK. NOTE: it's possible to add different pointer types to these platforms. For example, you could add a Poke pointer or Sphere pointer to VR. Additionally, VR devices with a gamepad could use the GGV pointer.
@@ -196,7 +196,7 @@ public class ColorTap : MonoBehaviour, IMixedRealityFocusHandler, IMixedRealityP
}
```

-### Query Pointers
+### Query pointers
It is possible to gather all currently active pointers by looping through the available input sources (i.e. the detected controllers and other inputs) to discover which pointers are attached to them.
@@ -216,7 +216,7 @@ foreach (var inputSource in CoreServices.InputSystem.DetectedInputSources)
}
```
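When only a specific pointer is needed rather than the full set, `PointerUtils` offers typed lookups; a hedged sketch (assuming the `GetPointer<T>` helper):

```c#
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class PointerLookupSketch : MonoBehaviour
{
    private void Update()
    {
        // Fetch the line pointer attached to the left hand, if one is active.
        LinePointer linePointer = PointerUtils.GetPointer<LinePointer>(Handedness.Left);
        if (linePointer != null)
        {
            Debug.Log($"Left hand ray origin: {linePointer.Position}");
        }
    }
}
```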

-#### Primary Pointer
+#### Primary pointer
Developers can subscribe to the FocusProvider's PrimaryPointerChanged event to be notified when the primary pointer in focus has changed. This can be extremely useful for identifying whether the user is currently interacting with a scene via gaze, a hand ray, or another input source.
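A minimal sketch of the subscription pattern (assuming the `SubscribeToPrimaryPointerChanged` API demonstrated in the PrimaryPointerExample scene referenced below):

```c#
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class PrimaryPointerSketch : MonoBehaviour
{
    private void OnEnable()
    {
        // The second argument requests an immediate callback with the current pointer.
        CoreServices.InputSystem?.FocusProvider?.SubscribeToPrimaryPointerChanged(OnPrimaryPointerChanged, true);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.FocusProvider?.UnsubscribeFromPrimaryPointerChanged(OnPrimaryPointerChanged);
    }

    private void OnPrimaryPointerChanged(IMixedRealityPointer oldPointer, IMixedRealityPointer newPointer)
    {
        Debug.Log($"Primary pointer is now: {newPointer?.PointerName ?? "none"}");
    }
}
```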
@@ -246,7 +246,7 @@ The [PrimaryPointerExample scene](https://github.com/microsoft/MixedRealityToolk
The pointer [`Result`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityPointer.Result) property contains the current result for the scene query used to determine the object with focus. For a raycast pointer, like the ones created by default for motion controllers, gaze input and hand rays, it will contain the location and normal of the raycast hit.
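For instance (a sketch assuming a pointer handler callback; `CurrentPointerTarget` and `Details` come from the pointer result):

```c#
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class PointerResultSketch : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        // Result holds the latest scene-query hit for this pointer.
        var result = eventData.Pointer.Result;
        if (result?.CurrentPointerTarget != null)
        {
            Debug.Log($"Hit {result.CurrentPointerTarget.name} at {result.Details.Point}");
        }
    }

    public void OnPointerDown(MixedRealityPointerEventData eventData) { }
    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }
    public void OnPointerUp(MixedRealityPointerEventData eventData) { }
}
```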
@@ -264,7 +264,7 @@ The [PointerResultExample scene](https://github.com/microsoft/MixedRealityToolki
To enable and disable pointers (for example, to disable the hand ray), set the [`PointerBehavior`](xref:Microsoft.MixedReality.Toolkit.Input.PointerBehavior) for a given pointer type via [`PointerUtils`](xref:Microsoft.MixedReality.Toolkit.Input.PointerUtils).
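For example, a sketch of turning the hand rays off (assuming the `SetHandRayPointerBehavior` helper on `PointerUtils`):

```c#
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class DisableHandRays : MonoBehaviour
{
    private void Start()
    {
        // Turn hand rays off for both hands; other pointer types are unaffected.
        PointerUtils.SetHandRayPointerBehavior(PointerBehavior.AlwaysOff, Handedness.Any);
    }
}
```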
@@ -297,7 +297,7 @@ For pointer events handled by [`IMixedRealityPointerHandler`](xref:Microsoft.Mix
The [**`Speech Input Handler`**](xref:Microsoft.MixedReality.Toolkit.Input.SpeechInputHandler) script can be added to a GameObject to handle speech commands using [**UnityEvents**](https://docs.unity3d.com/Manual/UnityEvents.html). It automatically shows the list of the defined keywords from the **Speech Commands Profile**.
Alternatively, developers can implement the [`IMixedRealitySpeechHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealitySpeechHandler) interface in a custom script component to [handle speech input events](InputEvents.md#input-event-interface-example).
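A minimal sketch of that interface route (assuming global registration with the input system; the keyword comes from the Speech Commands Profile):

```c#
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SpeechSketch : MonoBehaviour, IMixedRealitySpeechHandler
{
    private void OnEnable() =>
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);

    private void OnDisable() =>
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        // Command.Keyword matches one of the configured speech commands.
        Debug.Log($"Heard: {eventData.Command.Keyword}");
    }
}
```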

-## Example Scene
+## Example scene
The **SpeechInputExample** scene, in `MixedRealityToolkit.Examples\Demos\Input\Scenes\Speech`, shows how to use speech. You can also listen to speech command events directly in your own script by implementing [`IMixedRealitySpeechHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealitySpeechHandler) (see table of [event handlers](InputEvents.md)).
Documentation/InputSimulation/InputAnimationFileFormat.md
+9 -9 lines changed
@@ -1,4 +1,4 @@
-# Input Animation Binary File Format Specification
+# Input animation binary file format specification
## Overall structure
@@ -29,7 +29,7 @@ The input animation data consists of a sequence of animation curves. The number
| Hand Joints Left |[Joint Pose Curves](#joint-pose-curves)|
| Hand Joints Right |[Joint Pose Curves](#joint-pose-curves)|

-### Joint Pose Curves
+### Joint pose curves
For each hand a sequence of joint animation curves is stored. The number of joints is fixed, and a set of pose curves is stored for each joint.
@@ -63,7 +63,7 @@ For each hand a sequence of joint animation curves is stored. The number of join
| PinkyDistalJoint |[Pose Curves](#pose-curves)|
| PinkyTip |[Pose Curves](#pose-curves)|

-### Pose Curves
+### Pose curves
Pose curves are a sequence of 3 animation curves for the position vector, followed by 4 animation curves for the rotation quaternion.
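As an illustrative sketch of that ordering (hypothetical reader code; `ReadFloatCurve` is a placeholder for the float-curve decoding specified below):

```c#
using System.IO;
using UnityEngine;

public static class PoseCurveReaderSketch
{
    public static AnimationCurve[] ReadPoseCurves(BinaryReader reader)
    {
        // 3 position curves (X, Y, Z) followed by 4 rotation curves (X, Y, Z, W).
        var curves = new AnimationCurve[7];
        for (int i = 0; i < curves.Length; ++i)
        {
            curves[i] = ReadFloatCurve(reader);
        }
        return curves;
    }

    private static AnimationCurve ReadFloatCurve(BinaryReader reader)
    {
        // Placeholder: see the float curve section for the actual layout.
        throw new System.NotImplementedException();
    }
}
```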
@@ -77,7 +77,7 @@ Pose curves are a sequence of 3 animation curves for the position vector, follow
| Rotation Z |[Float Curve](#float-curve)|
| Rotation W |[Float Curve](#float-curve)|

-### Float Curve
+### Float curve
Floating point curves are fully fledged Bézier curves with a variable number of keyframes. Each keyframe stores a time and a curve value, as well as tangents and weights on the left and right side of each keyframe.
@@ -88,7 +88,7 @@ Floating point curves are fully fledged Bézier curves with a variable number of
| Number of keyframes | Int32 |
| Keyframes |[Float Keyframe](#float-keyframe)|

-### Float Keyframe
+### Float keyframe
A float keyframe stores tangent and weight values alongside the basic time and value.
@@ -102,7 +102,7 @@ A float keyframe stores tangent and weight values alongside the basic time and v
@@ -122,7 +122,7 @@ A boolean keyframe only stores a time and value.
| Time | Float32 |
| Value | Float32 |

-### Wrap Mode
+### Wrap mode
The semantics of Pre- and Post-Wrap modes follow the [Unity WrapMode](https://docs.unity3d.com/ScriptReference/WrapMode.html) definition. They are a combination of the following bits:
@@ -134,7 +134,7 @@ The semantics of Pre- and Post-Wrap modes follow the [Unity WrapMode](https://do
| 4 | PingPong: When time reaches the end of the animation clip, time will ping pong back between beginning and end. |
| 8 | ClampForever: Plays back the animation. When it reaches the end, it will keep playing the last frame and never stop playing. |

-### Weighted Mode
+### Weighted mode
The semantics of the Weighted mode follow the [Unity WeightedMode](https://docs.unity3d.com/ScriptReference/WeightedMode.html) definition.
Documentation/InputSimulation/InputAnimationRecording.md
+4 -4 lines changed
@@ -1,4 +1,4 @@
-# Input Animation Recording
+# Input animation recording
MRTK features a recording system by which head movement and hand tracking data can be stored in animation files. The recorded data can then be played back using the [input simulation system](InputSimulationService.md).
@@ -12,11 +12,11 @@ Recording input is a useful tool in a variety of situations:
The recording system supports a "rolling buffer" concept that allows recording recent input in the background.
See [Input Recording Service](#input-recording-service).

-## Recording and Playback services
+## Recording and playback services
Two input system services are provided to record and play back input respectively.

-### Input Recording Service
+### Input recording service
[`InputRecordingService`](xref:Microsoft.MixedReality.Toolkit.Input.InputRecordingService) takes data from the main camera transform and active hand controllers and stores it in an internal buffer. When requested this data is then serialized into binary files for storage and later replay.
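A short usage sketch (assuming the service is resolved via `CoreServices.GetInputSystemDataProvider` and that `SaveInputAnimation` has a parameterless overload; adjust to your configuration):

```c#
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class RecordingSketch : MonoBehaviour
{
    private InputRecordingService recordingService;

    private void Start()
    {
        // Resolve the service among the input system's data providers.
        recordingService = CoreServices.GetInputSystemDataProvider<InputRecordingService>();
        recordingService?.StartRecording();
    }

    private void OnDestroy()
    {
        recordingService?.StopRecording();
        // Serialize the recording buffer to a binary file (see below).
        string path = recordingService?.SaveInputAnimation();
        Debug.Log($"Recording saved to {path}");
    }
}
```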
@@ -32,7 +32,7 @@ The data in the recording buffer can be saved in a binary file using the [SaveIn
For details on the binary file format see [Input Animation File Format Specification](InputAnimationFileFormat.md).

-### Input Playback Service
+### Input playback service
[`InputPlaybackService`](xref:Microsoft.MixedReality.Toolkit.Input.InputPlaybackService) reads a binary file with input animation data and then applies this data through the [InputSimulationService](xref:Microsoft.MixedReality.Toolkit.Input.InputSimulationService) to recreate the recorded movements.
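A corresponding playback sketch (the file path is a placeholder; `LoadInputAnimation` is assumed to report success as a boolean):

```c#
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class PlaybackSketch : MonoBehaviour
{
    private void Start()
    {
        var playbackService = CoreServices.GetInputSystemDataProvider<InputPlaybackService>();
        // Load a previously recorded binary file and replay it via input simulation.
        if (playbackService != null && playbackService.LoadInputAnimation("/path/to/recording.bin"))
        {
            playbackService.Play();
        }
    }
}
```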