
Commit c0862b5

Authored by David Kline

Merge pull request #6899 from keveleigh/header-casing-3

Header capitalization phase 3

2 parents 9ea8a4c + 876bb3e, commit c0862b5

23 files changed: +124 -124 lines changed

Documentation/Input/HandTracking.md

Lines changed: 13 additions & 13 deletions
@@ -1,12 +1,12 @@
-# Hand Tracking
+# Hand tracking

-## Hand Tracking Profile
+## Hand tracking profile

The _Hand Tracking profile_ is found under the _Input System profile_. It contains settings for customizing hand representation.

<img src="../../Documentation/Images/Input/HandTrackingProfile.png" width="650px" style="display:block;">

-## Joint Prefabs
+## Joint prefabs

Joint prefabs are visualized using simple prefabs. The _Palm_ and _Index Finger_ joints are of special importance and have their own prefab, while all other joints share the same prefab.

@@ -17,7 +17,7 @@ By default the hand joint prefabs are simple geometric primitives. These can be

<img src="../../Documentation/Images/InputSimulation/MRTK_Core_Input_Hands_JointVisualizerPrefabs.png" width="350px" style="display:block;">

-## Hand Mesh Prefab
+## Hand mesh prefab

The hand mesh is used if fully defined mesh data is provided by the hand tracking device. The mesh renderable in the prefab is replaced by data from the device, so a dummy mesh such as a cube is sufficient. The material of the prefab is used for the hand mesh.

@@ -47,11 +47,11 @@ Available joints are listed in the [`TrackedHandJoint`](xref:Microsoft.MixedReal
> [!NOTE]
> Joint objects are destroyed when hand tracking is lost! Make sure that any scripts using the joint object handle the `null` case gracefully to avoid errors!

-### Accessing a given Hand Controller
+### Accessing a given hand controller

A specific hand controller is often available, e.g. when handling input events. In this case the joint data can be requested directly from the device, using the [`IMixedRealityHand`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHand) interface.

-#### Polling Joint Pose from Controller
+#### Polling joint pose from controller

The [`TryGetJoint`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHand.TryGetJoint*) function returns `false` if the requested joint is not available for some reason. In that case the resulting pose will be [`MixedRealityPose.ZeroIdentity`](xref:Microsoft.MixedReality.Toolkit.Utilities.MixedRealityPose.ZeroIdentity).

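The doc's example code block is elided by this hunk; for orientation, a minimal sketch of the polling pattern described above (the `MonoBehaviour` wrapper and the log line are illustrative assumptions, not the file's exact snippet):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class HandJointPollingExample : MonoBehaviour, IMixedRealitySourceStateHandler
{
    public void OnSourceDetected(SourceStateEventData eventData)
    {
        // Only hand controllers implement IMixedRealityHand.
        if (eventData.Controller is IMixedRealityHand hand &&
            hand.TryGetJoint(TrackedHandJoint.IndexTip, out MixedRealityPose jointPose))
        {
            Debug.Log($"Index tip at {jointPose.Position}");
        }
    }

    public void OnSourceLost(SourceStateEventData eventData) { }
}
```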
@@ -69,7 +69,7 @@ public void OnSourceDetected(SourceStateEventData eventData)
}
```

-#### Joint Transform from Hand Visualizer
+#### Joint transform from hand visualizer

Joint objects can be requested from the [controller visualizer](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityController.Visualizer).

@@ -91,7 +91,7 @@ public void OnSourceDetected(SourceStateEventData eventData)

If no specific controller is given then utility classes are provided for convenient access to hand joint data. These functions request joint data from the first available hand device currently tracked.

-#### Polling Joint Pose from HandJointUtils
+#### Polling joint pose from HandJointUtils

[`HandJointUtils`](xref:Microsoft.MixedReality.Toolkit.Input.HandJointUtils) is a static class that queries the first active hand device.

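The next hunk shows only the first line of the doc's `HandJointUtils` example; a hedged completion of that call pattern (the `Update` wrapper and pose usage are assumptions):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class FollowIndexTip : MonoBehaviour
{
    private void Update()
    {
        // Queries the first active hand device; returns false when no right hand is tracked.
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, Handedness.Right, out MixedRealityPose pose))
        {
            transform.SetPositionAndRotation(pose.Position, pose.Rotation);
        }
    }
}
```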
@@ -102,7 +102,7 @@ if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, Handedness.Right,
}
```

-#### Joint Transform from Hand Joint Service
+#### Joint transform from hand joint service

[`IMixedRealityHandJointService`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHandJointService) keeps a persistent set of [GameObjects](https://docs.unity3d.com/ScriptReference/GameObject.html) for tracking joints.
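The service-based variant is likewise elided apart from the `if (handJointService != null)` fragment in the next hunk header. A sketch, assuming the service is fetched via `CoreServices.GetInputSystemDataProvider` and exposes `RequestJointTransform`:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class JointServiceExample : MonoBehaviour
{
    private void Update()
    {
        var handJointService = CoreServices.GetInputSystemDataProvider<IMixedRealityHandJointService>();
        if (handJointService != null)
        {
            // The service keeps a persistent GameObject per requested joint; do not destroy it.
            Transform jointTransform = handJointService.RequestJointTransform(TrackedHandJoint.IndexTip, Handedness.Right);
            transform.position = jointTransform.position;
        }
    }
}
```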
@@ -124,11 +124,11 @@ if (handJointService != null)
}
```

-### Hand Tracking Events
+### Hand tracking events

The input system provides events as well, if polling data from controllers directly is not desirable.

-#### Joint Events
+#### Joint events

[`IMixedRealityHandJointHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHandJointHandler) handles updates of joint positions.

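A sketch of a joint-event handler; the diff shows only the class declaration, so the `MonoBehaviour` base and the register/unregister calls are additions made here for a self-contained example:

```csharp
using System.Collections.Generic;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class MyHandJointEventHandler : MonoBehaviour, IMixedRealityHandJointHandler
{
    // Global handlers must be registered with the input system to receive events.
    private void OnEnable() => CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandJointHandler>(this);
    private void OnDisable() => CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandJointHandler>(this);

    public void OnHandJointsUpdated(InputEventData<IDictionary<TrackedHandJoint, MixedRealityPose>> eventData)
    {
        if (eventData.InputData.TryGetValue(TrackedHandJoint.Palm, out MixedRealityPose palmPose))
        {
            Debug.Log($"{eventData.Handedness} palm at {palmPose.Position}");
        }
    }
}
```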
@@ -150,7 +150,7 @@ public class MyHandJointEventHandler : IMixedRealityHandJointHandler
}
```

-#### Mesh Events
+#### Mesh events

[`IMixedRealityHandMeshHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityHandMeshHandler) handles changes of the articulated hand mesh.
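And the mesh counterpart, a hedged sketch; the `HandMeshInfo` field names (`vertices`, `normals`, `triangles`) are assumed from the MRTK API, since the doc's own example is elided:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class MyHandMeshEventHandler : MonoBehaviour, IMixedRealityHandMeshHandler
{
    public MeshFilter meshFilter;

    private void OnEnable() => CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandMeshHandler>(this);
    private void OnDisable() => CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandMeshHandler>(this);

    public void OnHandMeshUpdated(InputEventData<HandMeshInfo> eventData)
    {
        // Copy the device-provided mesh data into a Unity mesh.
        Mesh mesh = meshFilter.mesh;
        mesh.vertices = eventData.InputData.vertices;
        mesh.normals = eventData.InputData.normals;
        mesh.triangles = eventData.InputData.triangles;
    }
}
```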

@@ -181,7 +181,7 @@ public class MyHandMeshEventHandler : IMixedRealityHandMeshHandler
}
```

-## Known Issues
+## Known issues

### .NET Native

Documentation/Input/InputActions.md

Lines changed: 8 additions & 8 deletions
@@ -1,8 +1,8 @@
-# Input Actions
+# Input actions

[**Input Actions**](InputActions.md) are abstractions over raw inputs meant to help isolate application logic from the specific input sources producing an input. It can be useful, for example, to define a *Select* action and map it to the left mouse button, a button in a gamepad, and a trigger in a 6 DOF controller. You can then have your application logic listen for *Select* input action events instead of having to be aware of all the different inputs that can produce it.

-## Creating An Input Action
+## Creating an input action

Input actions are configured in the **Input Actions Profile**, inside the *Input System Profile* in the Mixed Reality Toolkit component, specifying a name for the action and the type of inputs (*Axis Constraint*) it can be mapped to:

@@ -19,11 +19,11 @@ Six Dof | 3D pose with translation and rotation like the one produced by 6 DOF c

You can find the full list in [`AxisType`](xref:Microsoft.MixedReality.Toolkit.Utilities.AxisType).

-## Mapping Inputs To Actions
+## Mapping input to actions

The way you map an input to an action depends on the type of the input source:

-### Controller Inputs
+### Controller input

Go to the **Controller Input Mapping Profile**, under the *Input System Profile*. There you will find a list of all supported controllers:

@@ -33,22 +33,22 @@ Select the one you want to configure and a dialog window will appear with all th

<img src="../../Documentation/Images/Input/InputActionAssignment.PNG" style="max-width:100%;">

-### Speech Inputs
+### Speech input

In the **Speech Command Profile**, under the *Input System Profile*, you'll find the list of currently defined speech commands. To map one of them to an action, just select it in the *Action* drop down.

<img src="../../Documentation/Images/Input/SpeechCommandsProfile.png" style="max-width:100%;">

-### Gesture Inputs
+### Gesture input

The **Gestures Profile**, under the *Input System Profile*, contains all defined gestures. You can map each of them to an action by selecting it in the *Action* drop down.

<img src="../../Documentation/Images/Input/GestureProfile.png" style="max-width:100%;">

-## Handling Input Actions
+## Handling input actions

> [!WARNING]
-> Currently only input actions of *Digital* type can be handled using the methods described in this section. For other action types, you'll have to handle directly the events for the corresponding inputs instead. For example, to handle a 6 DOF action mapped to controller inputs, you'll have to use [`IMixedRealityGestureHandler<T>`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityGestureHandler`1) with T = [`MixedRealityPose`](xref:Microsoft.MixedReality.Toolkit.Utilities.MixedRealityPose).
+> Currently only input actions of *Digital* type can be handled using the methods described in this section. For other action types, you'll have to handle directly the events for the corresponding inputs instead. For example, to handle a 6 DOF action mapped to controller inputs, you'll have to use [`IMixedRealityGestureHandler<T>`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityGestureHandler`1) with T = [`MixedRealityPose`](xref:Microsoft.MixedReality.Toolkit.Utilities.MixedRealityPose).

The easiest way to handle input actions is to make use of the [`InputActionHandler`](xref:Microsoft.MixedReality.Toolkit.Input.InputActionHandler) script. This allows you to define the action you want to listen to and react to action started and ended events using Unity Events.
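For readers skimming the diff, a sketch of handling the same events in code via the input-action handler interface (interface and property names from the MRTK input API; the class itself is illustrative):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SelectActionHandler : MonoBehaviour, IMixedRealityInputActionHandler
{
    [SerializeField] private MixedRealityInputAction selectAction = MixedRealityInputAction.None;

    private void OnEnable() => CoreServices.InputSystem?.RegisterHandler<IMixedRealityInputActionHandler>(this);
    private void OnDisable() => CoreServices.InputSystem?.UnregisterHandler<IMixedRealityInputActionHandler>(this);

    public void OnActionStarted(BaseInputEventData eventData)
    {
        // Compare against the action configured in the Input Actions Profile.
        if (eventData.MixedRealityInputAction == selectAction)
        {
            Debug.Log("Select started");
        }
    }

    public void OnActionEnded(BaseInputEventData eventData) { }
}
```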

Documentation/Input/InputProviders.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-# Input Providers
+# Input providers

Input providers are registered in the **Registered Service Providers Profile**, found in the Mixed Reality Toolkit component:

Documentation/Input/InputState.md

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-# Accessing Input State in MRTK
+# Accessing input state in MRTK

It's possible to directly query the state of all inputs in MRTK by iterating over the controllers attached to the input sources. MRTK also provides convenience methods for accessing the position and rotation of the eyes, hands, head, and motion controller.

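The iteration the next hunk refers to is mostly elided; a minimal sketch of walking the detected controllers (the logging body is illustrative):

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

public class ControllerStateLogger : MonoBehaviour
{
    private void Update()
    {
        foreach (var controller in CoreServices.InputSystem.DetectedControllers)
        {
            // Each interaction mapping exposes the current state of one input (button, axis or pose).
            Debug.Log($"{controller.ControllerHandedness}: {controller.Interactions?.Length} inputs");
        }
    }
}
```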
@@ -49,7 +49,7 @@ foreach(var controller in CoreServices.InputSystem.DetectedControllers)
}
```

-## See Also
+## See also

- [InputEvents](InputEvents.md)
- [Pointers](Pointers.md)

Documentation/Input/Overview.md

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-# Input Overview
+# Input overview

The Input System in MRTK allows you to:

@@ -22,4 +22,4 @@ Controllers can have [**Pointers**](Pointers.md) attached to them that query the

While you can handle [input events directly in UI components](InputEvents.md), it is recommended to use [pointer events](pointers.md#pointer-event-interfaces) to keep the implementation device-independent.

-MRTK also provides several convenience methods to query input state directly in a device-independent way. See [Accessing Input State in MRTK](InputState.md) for more details.
+MRTK also provides several convenience methods to query input state directly in a device-independent way. See [Accessing input state in MRTK](InputState.md) for more details.

Documentation/Input/Pointers.md

Lines changed: 7 additions & 7 deletions
@@ -114,7 +114,7 @@ Useful Sphere Pointer properties:

<img src="../../Documentation/Images/Pointers/MRTK_Pointers_Parabolic.png" width="400">

-## Pointer support for Mixed Reality Platforms
+## Pointer support for mixed reality platforms

The following table details the pointer types that are typically used for the common platforms in MRTK. NOTE:
it's possible to add different pointer types to these platforms. For example, you could add a Poke pointer or Sphere pointer to VR. Additionally, VR devices with a gamepad could use the GGV pointer.
@@ -196,7 +196,7 @@ public class ColorTap : MonoBehaviour, IMixedRealityFocusHandler, IMixedRealityP
}
```

-### Query Pointers
+### Query pointers

It is possible to gather all currently active pointers by looping through the available input sources (i.e. controllers and inputs available) to discover which pointers are attached to them.

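The loop body is elided by the diff; a sketch of the pattern, assuming each input source exposes its attached pointers via `Pointers`:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

public class PointerLister : MonoBehaviour
{
    private void Update()
    {
        foreach (var inputSource in CoreServices.InputSystem.DetectedInputSources)
        {
            foreach (var pointer in inputSource.Pointers)
            {
                Debug.Log($"{inputSource.SourceName}: {pointer.PointerName} (active: {pointer.IsActive})");
            }
        }
    }
}
```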
@@ -216,7 +216,7 @@ foreach (var inputSource in CoreServices.InputSystem.DetectedInputSources)
}
```

-#### Primary Pointer
+#### Primary pointer

Developers can subscribe to the FocusProvider's PrimaryPointerChanged event to be notified when the primary pointer in focus has changed. This can be extremely useful to identify whether the user is currently interacting with a scene via gaze, a hand ray, or another input source.

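A sketch of the subscription described above; the `SubscribeToPrimaryPointerChanged`/`UnsubscribeFromPrimaryPointerChanged` names are taken from the MRTK focus provider API and hedged here, since the doc's own example is elided:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class PrimaryPointerWatcher : MonoBehaviour
{
    private void OnEnable()
    {
        // 'true' requests an immediate callback with the current primary pointer.
        CoreServices.InputSystem?.FocusProvider?.SubscribeToPrimaryPointerChanged(OnPrimaryPointerChanged, true);
    }

    private void OnDisable() =>
        CoreServices.InputSystem?.FocusProvider?.UnsubscribeFromPrimaryPointerChanged(OnPrimaryPointerChanged);

    private void OnPrimaryPointerChanged(IMixedRealityPointer oldPointer, IMixedRealityPointer newPointer)
    {
        // Either argument may be null, e.g. while no pointer has focus.
        Debug.Log($"Primary pointer is now {newPointer?.PointerName ?? "none"}");
    }
}
```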
@@ -246,7 +246,7 @@ The [PrimaryPointerExample scene](https://github.com/microsoft/MixedRealityToolk

<img src="../../Documentation/Images/Pointers/PrimaryPointerExample.png" style="max-width:100%;">

-### Pointer Result
+### Pointer result

The pointer [`Result`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealityPointer.Result) property contains the current result for the scene query used to determine the object with focus. For a raycast pointer, like the ones created by default for motion controllers, gaze input and hand rays, it will contain the location and normal of the raycast hit.

@@ -264,7 +264,7 @@ The [PointerResultExample scene](https://github.com/microsoft/MixedRealityToolki

<img src="../../Documentation/Images/Input/PointerResultExample.png" style="max-width:100%;">

-### Disable Pointers
+### Disable pointers

To enable and disable pointers (for example, to disable the hand ray), set the [`PointerBehavior`](xref:Microsoft.MixedReality.Toolkit.Input.PointerBehavior) for a given pointer type via [`PointerUtils`](xref:Microsoft.MixedReality.Toolkit.Input.PointerUtils).
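A sketch of toggling the hand ray via `PointerUtils`; the wrapper class is illustrative, and `SetHandRayPointerBehavior` is assumed from the MRTK utilities named above:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;

public static class HandRayToggle
{
    public static void SetHandRaysEnabled(bool enabled)
    {
        // PointerBehavior.Default restores MRTK's standard show/hide logic.
        PointerUtils.SetHandRayPointerBehavior(
            enabled ? PointerBehavior.Default : PointerBehavior.AlwaysOff,
            Handedness.Any);
    }
}
```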

@@ -297,7 +297,7 @@ For pointer events handled by [`IMixedRealityPointerHandler`](xref:Microsoft.Mix

<img src="../../Documentation/Images/Pointers/PointerHandler.png" style="max-width:100%;">

-## Pointer Extent
+## Pointer extent

Far pointers have settings which limit how far they will raycast and interact with other objects in the scene.
By default, this value is set to 10 meters. This value was chosen to remain consistent with the behavior
@@ -312,7 +312,7 @@ fields:
*Default Pointer Extent* - This controls the length of the pointer ray/line that will
render when the pointer is not interacting with anything.

-## See Also
+## See also

- [Pointer Architecture](../Architecture/InputSystem/ControllersPointersAndFocus.md)
- [Input Events](InputEvents.md)

Documentation/Input/Speech.md

Lines changed: 2 additions & 2 deletions
@@ -8,7 +8,7 @@ Speech input providers, like *Windows Speech Input*, don't create any controller

<img src="../../Documentation/Images/Input/SpeechCommandsProfile.png" width="450px">

-## Handling Speech Input
+## Handling speech input

The [**`Speech Input Handler`**](xref:Microsoft.MixedReality.Toolkit.Input.SpeechInputHandler) script can be added to a GameObject to handle speech commands using [**UnityEvents**](https://docs.unity3d.com/Manual/UnityEvents.html). It automatically shows the list of the defined keywords from the **Speech Commands Profile**.

@@ -20,7 +20,7 @@ Assign optional **SpeechConfirmationTooltip.prefab** to display animated confirm

Alternatively, developers can implement the [`IMixedRealitySpeechHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealitySpeechHandler) interface in a custom script component to [handle speech input events](InputEvents.md#input-event-interface-example).

-## Example Scene
+## Example scene

The **SpeechInputExample** scene, in `MixedRealityToolkit.Examples\Demos\Input\Scenes\Speech`, shows how to use speech. You can also listen to speech command events directly in your own script by implementing [`IMixedRealitySpeechHandler`](xref:Microsoft.MixedReality.Toolkit.Input.IMixedRealitySpeechHandler) (see table of [event handlers](InputEvents.md)).
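A sketch of the handler-interface route mentioned above (the keyword string and class are illustrative; the keyword must exist in the Speech Commands Profile):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SpeechHandlerExample : MonoBehaviour, IMixedRealitySpeechHandler
{
    private void OnEnable() => CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);
    private void OnDisable() => CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.Command.Keyword == "change color")
        {
            Debug.Log("Speech command recognized");
        }
    }
}
```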

Documentation/InputSimulation/InputAnimationFileFormat.md

Lines changed: 9 additions & 9 deletions
@@ -1,4 +1,4 @@
-# Input Animation Binary File Format Specification
+# Input animation binary file format specification

## Overall structure

@@ -29,7 +29,7 @@ The input animation data consists of a sequence of animation curves. The number
| Hand Joints Left | [Joint Pose Curves](#joint-pose-curves) |
| Hand Joints Right | [Joint Pose Curves](#joint-pose-curves) |

-### Joint Pose Curves
+### Joint pose curves

For each hand a sequence of joint animation curves is stored. The number of joints is fixed, and a set of pose curves is stored for each joint.

@@ -63,7 +63,7 @@ For each hand a sequence of joint animation curves is stored. The number of join
| PinkyDistalJoint | [Pose Curves](#pose-curves) |
| PinkyTip | [Pose Curves](#pose-curves) |

-### Pose Curves
+### Pose curves

Pose curves are a sequence of 3 animation curves for the position vector, followed by 4 animation curves for the rotation quaternion.

@@ -77,7 +77,7 @@ Pose curves are a sequence of 3 animation curves for the position vector, follow
| Rotation Z | [Float Curve](#float-curve) |
| Rotation W | [Float Curve](#float-curve) |

-### Float Curve
+### Float curve

Floating point curves are fully fledged Bézier curves with a variable number of keyframes. Each keyframe stores a time and a curve value, as well as tangents and weights on the left and right side of each keyframe.

@@ -88,7 +88,7 @@ Floating point curves are fully fledged Bézier curves with a variable number of
| Number of keyframes | Int32 |
| Keyframes | [Float Keyframe](#float-keyframe) |

-### Float Keyframe
+### Float keyframe

A float keyframe stores tangent and weight values alongside the basic time and value.

@@ -102,7 +102,7 @@ A float keyframe stores tangent and weight values alongside the basic time and v
| OutWeight | Float32 |
| WeightedMode | Int32, [Weighted Mode](#weighted-mode) |

-### Boolean Curve
+### Boolean curve

Boolean curves are simple sequences of on/off values. On every keyframe the value of the curve flips immediately.

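Reading a float keyframe follows directly from the tables above; this sketch assumes the field order for the rows elided by the diff (Time, Value, InTangent, OutTangent, InWeight) based on the spec's prose:

```csharp
using System.IO;

public static class InputAnimationReader
{
    // Hypothetical helper; names for the elided fields are assumptions.
    public static void ReadFloatKeyframe(BinaryReader reader)
    {
        float time = reader.ReadSingle();        // Time, Float32
        float value = reader.ReadSingle();       // Value, Float32
        float inTangent = reader.ReadSingle();   // InTangent, Float32 (assumed)
        float outTangent = reader.ReadSingle();  // OutTangent, Float32 (assumed)
        float inWeight = reader.ReadSingle();    // InWeight, Float32 (assumed)
        float outWeight = reader.ReadSingle();   // OutWeight, Float32
        int weightedMode = reader.ReadInt32();   // WeightedMode, Int32
    }
}
```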
@@ -113,7 +113,7 @@ Boolean curves are simple sequences of on/off values. On every keyframe the valu
| Number of keyframes | Int32 |
| Keyframes | [Boolean Keyframe](#boolean-keyframe) |

-### Boolean Keyframe
+### Boolean keyframe

A boolean keyframe only stores a time and value.

@@ -122,7 +122,7 @@ A boolean keyframe only stores a time and value.
| Time | Float32 |
| Value | Float32 |

-### Wrap Mode
+### Wrap mode

The semantics of Pre- and Post-Wrap modes follow the [Unity WrapMode](https://docs.unity3d.com/ScriptReference/WrapMode.html) definition. They are a combination of the following bits:

@@ -134,7 +134,7 @@ The semantics of Pre- and Post-Wrap modes follow the [Unity WrapMode](https://do
| 4 | PingPong: When time reaches the end of the animation clip, time will ping pong back between beginning and end. |
| 8 | ClampForever: Plays back the animation. When it reaches the end, it will keep playing the last frame and never stop playing. |

-### Weighted Mode
+### Weighted mode

The semantics of the Weighted mode follow the [Unity WeightedMode](https://docs.unity3d.com/ScriptReference/WeightedMode.html) definition.

Documentation/InputSimulation/InputAnimationRecording.md

Lines changed: 4 additions & 4 deletions
@@ -1,4 +1,4 @@
-# Input Animation Recording
+# Input animation recording

MRTK features a recording system by which head movement and hand tracking data can be stored in animation files. The recorded data can then be played back using the [input simulation system](InputSimulationService.md).

@@ -12,11 +12,11 @@ Recording input is a useful tool in a variety of situations:
The recording system supports a "rolling buffer" concept that allows recording recent input in the background.
See [Input Recording Service](#input-recording-service).

-## Recording and Playback services
+## Recording and playback services

Two input system services are provided to record and play back input respectively.

-### Input Recording Service
+### Input recording service

[`InputRecordingService`](xref:Microsoft.MixedReality.Toolkit.Input.InputRecordingService) takes data from the main camera transform and active hand controllers and stores it in an internal buffer. When requested this data is then serialized into binary files for storage and later replay.

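A hedged sketch of driving the recording service; the data-provider lookup follows MRTK's usual pattern, and the `SaveInputAnimation` signature is an assumption (the hunk header below names the method but elides its documentation):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;

public static class RecordingExample
{
    public static string RecordSnippet()
    {
        var recorder = CoreServices.GetInputSystemDataProvider<IMixedRealityInputRecordingService>();
        if (recorder == null) { return null; }

        recorder.StartRecording();
        // ... head and hand input accumulates in the rolling buffer ...
        recorder.StopRecording();

        // Serializes the buffer to a binary input animation file; returns the file path.
        return recorder.SaveInputAnimation();
    }
}
```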
@@ -32,7 +32,7 @@ The data in the recording buffer can be saved in a binary file using the [SaveIn

For details on the binary file format see [Input Animation File Format Specification](InputAnimationFileFormat.md).

-### Input Playback Service
+### Input playback service

[`InputPlaybackService`](xref:Microsoft.MixedReality.Toolkit.Input.InputPlaybackService) reads a binary file with input animation data and then applies this data through the [InputSimulationService](xref:Microsoft.MixedReality.Toolkit.Input.InputSimulationService) to recreate the recorded movements.

0 commit comments