
Commit 89835b8

Spell check: documentation pass
1 parent f9f3369 commit 89835b8

13 files changed (+20 / -20 lines)

Documentation/DevDocGuide.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -68,7 +68,7 @@ The toc file in the root of the project defines entries in the top navigation ba
 toc.yml files can be used for structuring and there can be any amount of those files. For more info about defining entries for toc.yml check the [docfx documentation entry on toc](https://dotnet.github.io/docfx/tutorial/intro_toc.html).
 
 ## Resource files
-There are some files like images, videos or pdfs that the documentation can refer to but are not converted by docfx. For those files there's a resource section in the docfx.json. Files in that section will only be copied over without performing any conversion on them.
+There are some files like images, videos or PDFs that the documentation can refer to but are not converted by docfx. For those files there's a resource section in the docfx.json. Files in that section will only be copied over without performing any conversion on them.
 
 Currently there's a definition for the following resource types:
 
```
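For orientation, the resource section being referred to lives under `build` in docfx.json and looks roughly like the sketch below; the glob patterns here are illustrative placeholders, not this project's actual configuration:

```json
{
  "build": {
    "resource": [
      {
        "files": [ "External/ReadMeImages/**", "**/*.pdf" ]
      }
    ]
  }
}
```

Files matched by those patterns are copied to the output site as-is, with no docfx conversion.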

Documentation/InputSimulation/InputSimulationService.md

Lines changed: 3 additions & 3 deletions
```diff
@@ -1,9 +1,9 @@
 # Input Simulation Service
 
 The Input Simulation Service emulates the behaviour of devices and platforms that may not be available in the Unity editor. Examples include:
-* Hololens or VR device head tracking
-* Hololens hand gestures
-* Hololens 2 articulated hand tracking
+* HoloLens or VR device head tracking
+* HoloLens hand gestures
+* HoloLens 2 articulated hand tracking
 
 Users can use a conventional keyboard and mouse combination to control simulated devices at runtime. This allows testing of interactions in the Unity editor without first deploying to a device.
 
```

Documentation/MixedRealityConfigurationGuide.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -173,7 +173,7 @@ The diagnostics profile provides several simple systems to monitor whilst the pr
 
 One of the more advanced areas of the Mixed Reality Toolkit is its [service locator pattern](https://en.wikipedia.org/wiki/Service_locator_pattern) implementation which allows the registering of any "Service" with the framework. This allows the framework to be both extended with new features / systems easily but also allows for projects to take advantage of these capabilities to register their own runtime components.
 
-> You can read more about the underlying framework and it's implementation in [Stephen Hodgson's article on the Mixed Reality Framework](https://medium.com/@stephen_hodgson/the-mixed-reality-framework-6fdb5c11feb2)
+> You can read more about the underlying framework and its implementation in [Stephen Hodgson's article on the Mixed Reality Framework](https://medium.com/@stephen_hodgson/the-mixed-reality-framework-6fdb5c11feb2)
 
 Any registered service still gets the full advantage of all of the Unity events, without the overhead and cost of implementing a MonoBehaviour or clunky singleton patterns. This allows for pure C# components with no scene overhead for running both foreground and background processes, e.g. spawning systems, runtime gamelogic, or practically anything else.
 
```
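To make the pattern concrete, here is a minimal self-contained service locator sketch in plain C#. The names are hypothetical and this is not the Toolkit's actual registration API; it only illustrates the idea of registering and resolving pure C# services:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical minimal service locator, for illustration only.
public static class ServiceLocator
{
    private static readonly Dictionary<Type, object> services = new Dictionary<Type, object>();

    // Register a concrete service instance under an interface type.
    public static void Register<T>(T service) where T : class
    {
        services[typeof(T)] = service;
    }

    // Resolve a previously registered service by its interface.
    public static T Get<T>() where T : class
    {
        return (T)services[typeof(T)];
    }
}

// A pure C# "service" with no MonoBehaviour or singleton overhead.
public interface ISpawningSystem
{
    void Update();
}

public class SpawningSystem : ISpawningSystem
{
    public void Update()
    {
        // Spawn logic, driven from the framework's single update loop.
    }
}
```

After `ServiceLocator.Register<ISpawningSystem>(new SpawningSystem())` runs at startup, any code can resolve the service with `ServiceLocator.Get<ISpawningSystem>()` while a central loop ticks its `Update()`.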

```diff
@@ -253,7 +253,7 @@ There's an additional helper button to quickly jump to the Gaze Provider to over
 
 ## Gestures Configuration
 
-Gestures are a system specific implementation allowing you to assign Input Actions to the various "Gesture" input methods provided by various SDK's (e.g. HoloLens).
+Gestures are a system specific implementation allowing you to assign Input Actions to the various "Gesture" input methods provided by various SDKs (e.g. HoloLens).
 
 > Note, the current implementation is for the HoloLens only and will be enhanced for other systems as they are added to the Toolkit in the future (no dates yet).
 
```
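As a sketch of consuming such a mapping, a component can listen for the Input Action a gesture raises. The interface and property names below follow MRTK v2 conventions and may differ between Toolkit versions; the component must also be registered with the input system to actually receive events:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Sketch: react to the Input Action that a gesture (e.g. the HoloLens Tap) is mapped to.
public class GestureActionListener : MonoBehaviour, IMixedRealityInputActionHandler
{
    [SerializeField]
    private MixedRealityInputAction selectAction = MixedRealityInputAction.None;

    public void OnActionStarted(BaseInputEventData eventData)
    {
        // Fires for whichever gesture the profile mapped to this action.
        if (eventData.MixedRealityInputAction == selectAction)
        {
            Debug.Log("Gesture-mapped action started");
        }
    }

    public void OnActionEnded(BaseInputEventData eventData)
    {
        // No-op in this example.
    }
}
```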

Documentation/README_AppBar.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@
 App Bar is a UI component used with Bounding Box. Using the 'Adjust' button, you can turn on/off the Bounding Box interface for manipulating object.
 
 ## How to use App Bar
-Drog and drop **AppBar** prefab into the scene hierarchy. In the inspector panel of the AppBar, you will see **Bounding Box** under **Target Bounding Box** section. Assign any objects that has Bounding Box. **Important: Target object's Bounding Box activation option should be 'Activate Manually'**
+Drag and drop **AppBar** prefab into the scene hierarchy. In the inspector panel of the AppBar, you will see **Bounding Box** under **Target Bounding Box** section. Assign any objects that has Bounding Box. **Important: Target object's Bounding Box activation option should be 'Activate Manually'**
 
 <img src="/External/ReadMeImages/AppBar/MRTK_AppBar_Setup1.png" width="450">
 
```

Documentation/README_BoundingBox.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -61,4 +61,4 @@ If you want to make the object movable using far interaction, you can combine [`
 
 <img src="/External/ReadMeImages/BoundingBox/MRTK_BoundingBox_ManipulationHandler.png" width="450">
 
-In order for the bounding box edges to be highlighted the same way when moving it using [`ManipulationHandler`](README_ManipulationHandler.md)'s far interaction, it is advised to connect its events for **On Manipulation Started** / **On Manipulation Ended** to `BoundingBox.HightlightWires` / `BoundingBox.UnhighlightWires` respectively, as shown in the screenshot above.
+In order for the bounding box edges to be highlighted the same way when moving it using [`ManipulationHandler`](README_ManipulationHandler.md)'s far interaction, it is advised to connect its events for **On Manipulation Started** / **On Manipulation Ended** to `BoundingBox.HighlightWires` / `BoundingBox.UnhighlightWires` respectively, as shown in the screenshot above.
```
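The same hookup can be done from script instead of the inspector. A minimal sketch, assuming both components sit on the same GameObject (the namespace follows MRTK v2 conventions and may differ per version):

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Wires ManipulationHandler's events to BoundingBox wire highlighting.
public class HighlightWiresOnManipulation : MonoBehaviour
{
    private void Start()
    {
        var boundingBox = GetComponent<BoundingBox>();
        var handler = GetComponent<ManipulationHandler>();

        // Keep the box edges highlighted for the duration of a far-interaction drag.
        handler.OnManipulationStarted.AddListener(_ => boundingBox.HighlightWires());
        handler.OnManipulationEnded.AddListener(_ => boundingBox.UnhighlightWires());
    }
}
```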

Documentation/README_Button.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -22,7 +22,7 @@ In the idle state, the button's front plate is not visible. As a finger approach
 
 <img src="/External/ReadMeImages/Button/MRTK_Button_InteractionStates.png" width="600">
 
-The subtle pulse effect is triggerd by the `PressableButton.` The `PressableButton` looks for `ProximityLight(s)` that live on the currently interacting pointer. If any `ProximityLight(s)` are found, the ProximityLight.Pulse method is called which automatically animates shader parameters to display a pulse.
+The subtle pulse effect is triggered by the `PressableButton.` The `PressableButton` looks for `ProximityLight(s)` that live on the currently interacting pointer. If any `ProximityLight(s)` are found, the ProximityLight.Pulse method is called which automatically animates shader parameters to display a pulse.
 
 ## Property Inspector of PressableButton
 ![Button](/External/ReadMeImages/Button/MRTK_Button_Structure.png)
```
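As a simplified illustration of that lookup-and-pulse flow (not the actual `PressableButton` code; the `ProximityLight` namespace and the exact `Pulse` signature may differ between MRTK versions):

```csharp
using Microsoft.MixedReality.Toolkit.Utilities; // assumed home of ProximityLight
using UnityEngine;

// Finds ProximityLights on the interacting pointer and triggers their pulse.
public static class ProximityPulse
{
    public static void PulseLightsOnPointer(GameObject pointerObject)
    {
        // ProximityLights may live anywhere in the pointer's hierarchy.
        foreach (var light in pointerObject.GetComponentsInChildren<ProximityLight>())
        {
            // Animates MixedRealityStandard shader parameters to display the pulse.
            light.Pulse();
        }
    }
}
```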

Documentation/README_FingertipVisualization.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -12,7 +12,7 @@ By default the fingertip visualization will work in any Unity scene that is conf
 - PokePointer
 - FingerCursor
 
-At a high level the fingertip visualization works by using a proximity light to project a colored gradient on any nearby surfaces that accept proximity lights. The finger cursor then looks for any nearby interactible surfaces, which are determined by parent IMixedRealityNearPointer(s), to align the finger ring with a surface as the finger moves towards a surface. As a finger approaches a surface the finger ring is also dynamically animated using the round corner properties of the MixedRealityStandard shader.
+At a high level the fingertip visualization works by using a proximity light to project a colored gradient on any nearby surfaces that accept proximity lights. The finger cursor then looks for any nearby interactable surfaces, which are determined by parent IMixedRealityNearPointer(s), to align the finger ring with a surface as the finger moves towards a surface. As a finger approaches a surface the finger ring is also dynamically animated using the round corner properties of the MixedRealityStandard shader.
 
 ### Example Scene ###
 
```
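The ring alignment described above can be pictured with a small hypothetical sketch (not the actual FingerCursor implementation): as the fingertip nears a surface, the ring's rotation blends from the finger's own orientation toward the surface normal.

```csharp
using UnityEngine;

// Hypothetical sketch of aligning a fingertip ring with a nearby surface.
public class FingerRingAlignment : MonoBehaviour
{
    [SerializeField]
    private float alignmentDistance = 0.1f; // meters; full alignment at contact

    public void UpdateRing(Vector3 fingertip, Quaternion fingerRotation,
                           Vector3 surfacePoint, Vector3 surfaceNormal)
    {
        float distance = Vector3.Distance(fingertip, surfacePoint);
        // 0 when touching the surface, 1 at or beyond the alignment distance.
        float t = Mathf.Clamp01(distance / alignmentDistance);

        // Face along the surface normal when close, along the finger when far.
        Quaternion surfaceRotation = Quaternion.LookRotation(-surfaceNormal);
        transform.SetPositionAndRotation(
            fingertip,
            Quaternion.Slerp(surfaceRotation, fingerRotation, t));
    }
}
```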

Documentation/README_Interactable.md

Lines changed: 3 additions & 3 deletions
```diff
@@ -69,7 +69,7 @@ Example of the Default Theme.
 Example of a Color Theme
 
 The best way to save a profile of a button, with all the themes and targets setup, is to create a prefab of your button.
-_Note: Themes that effect mesh objects (Color or Shader Themes) are able to detect the shader properties in the material assigned to the target object. A drop down list of shader properties will define how the values of the theme are applied and is a conveience of this ability. Conflicts can arrise if the same theme is used on objects that do not share the same material shader setting. Best practice is to create a seperate theme for objects with different shaders; this is not an issue when using the same Color Theme on a text object and a mesh object, because all the shader properties are ignored on text objects._
+_Note: Themes that effect mesh objects (Color or Shader Themes) are able to detect the shader properties in the material assigned to the target object. A drop down list of shader properties will define how the values of the theme are applied and is a convenience of this ability. Conflicts can arise if the same theme is used on objects that do not share the same material shader setting. Best practice is to create a separate theme for objects with different shaders; this is not an issue when using the same Color Theme on a text object and a mesh object, because all the shader properties are ignored on text objects._
 
 ### Creating Toggles
 Toggle or multi-step buttons can be created in the Profile using the Dimensions field. The idea is that each set of states can have multiple dimensions and in this case, when the Dimensions value is increased, slots for additional themes are provided for each item in the Profile. This allows for a Normal Theme and a Toggled Theme to be used depending if the Interactable is toggled or not.
```
```diff
@@ -116,7 +116,7 @@ States are a list of terms that can be used to define interactions phases, like
 
 Interactable States provides two major roles.
 - Establish a list of states that we care about. This list will be displayed in the themes and can also be referenced by the events.
-- Controls how different interaction phases are ranked into states. For instance, a press state is also in a focused state, but the InteractableStates class will define it is a press state based on the ranking preferences setup in the State scriptableObject.
+- Controls how different interaction phases are ranked into states. For instance, a press state is also in a focused state, but the InteractableStates class will define it is a press state based on the ranking preferences setup in the State ScriptableObject.
 
 <img src="/External/ReadMeImages/Interactable/StatesScriptableObject.png" width="450">
 
```
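As a rough illustration of that ranking rule (hypothetical code, not the actual InteractableStates implementation), resolving the current state amounts to picking the highest-ranked phase that is active:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of ranking interaction phases into a single state.
public enum InteractionPhase { Default, Focus, Press }

public static class StateRanking
{
    // Higher rank wins when several phases are active at once; in the Toolkit
    // the ordering comes from the State ScriptableObject instead.
    private static readonly Dictionary<InteractionPhase, int> rank =
        new Dictionary<InteractionPhase, int>
        {
            { InteractionPhase.Default, 0 },
            { InteractionPhase.Focus, 1 },
            { InteractionPhase.Press, 2 },
        };

    // Assumes at least one phase (Default) is always active.
    public static InteractionPhase Resolve(IEnumerable<InteractionPhase> activePhases)
    {
        // A pressed Interactable is also focused; press outranks focus.
        return activePhases.OrderByDescending(p => rank[p]).First();
    }
}
```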

```diff
@@ -232,7 +232,7 @@ public override void OnUpdate(InteractableStates state, Interactable source)
     bool isDefault = state.GetState(InteractableStates.InteractableStateEnum.Default).Value > 0;
     bool hasGesture = state.GetState(InteractableStates.InteractableStateEnum.Gesture).Value > 0;
     bool hasGestureMax = state.GetState(InteractableStates.InteractableStateEnum.GestureMax).Value > 0;
-    bool hasCollistion = state.GetState(InteractableStates.InteractableStateEnum.Collision).Value > 0;
+    bool hasCollision = state.GetState(InteractableStates.InteractableStateEnum.Collision).Value > 0;
     bool hasPhysicalTouch = state.GetState(InteractableStates.InteractableStateEnum.PhysicalTouch).Value > 0;
     bool hasCustom = state.GetState(InteractableStates.InteractableStateEnum.Custom).Value > 0;
 or:
```

Documentation/README_ManipulationHandler.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -47,7 +47,7 @@ Specifies how the object will behave when it is being grabbed with one hand/cont
 ### One Hand Rotation Mode Options
 * Maintain original rotation - does not rotate object as it is being moved
 * Maintain rotation to user - maintains the object's original rotation to the user
-* Gravity aligned maintain rotation to user - maintain's object's original rotation to user, but makes the object vertical. Useful for bounding boxes.
+* Gravity aligned maintain rotation to user - maintains object's original rotation to user, but makes the object vertical. Useful for bounding boxes.
 * Face user - ensures object always faces the user. Useful for slates/panels.
 * Face away from user - ensures object always faces away from user. Useful for slates/panels that are configured backwards.
 * Rotate about object center - Only works for articulated hands/controllers. Rotate object using rotation of the hand/controller, but about the object center point. Useful for inspecting at a distance.
```

Documentation/README_Pointers.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -29,7 +29,7 @@ For other controllers like HoloLens 2 articulated hands, the rotation matches th
 ### GGV Pointer
 GGV stands for "Gaze, Gesture, Voice"<sup>[2](https://docs.microsoft.com/en-us/windows/mixed-reality/gaze)</sup>. The GGV pointer's position and direction is driven by the head's position and rotation. The pointer is used to provide input that matches the HoloLens V1 input style of head gaze + airtap<sup>[3](https://docs.microsoft.com/en-us/windows/mixed-reality/gestures)</sup>.
 
-In the pointer profile you can see that the V1 Hololens input system is provided for you via the mapping of "GGVHand" (V1 HoloLens hand) to the GGVPointer.
+In the pointer profile you can see that the V1 HoloLens input system is provided for you via the mapping of "GGVHand" (V1 HoloLens hand) to the GGVPointer.
 
 <img src="/External/ReadMeImages/Pointers/MRTK_GGVPointer_HL1.jpg" width="600">
 
```
