Documentation/DevDocGuide.md (1 addition & 1 deletion)
@@ -68,7 +68,7 @@ The toc file in the root of the project defines entries in the top navigation ba
toc.yml files can be used for structuring and there can be any amount of those files. For more info about defining entries for toc.yml check the [docfx documentation entry on toc](https://dotnet.github.io/docfx/tutorial/intro_toc.html).
## Resource files
- There are some files like images, videos or pdfs that the documentation can refer to but are not converted by docfx. For those files there's a resource section in the docfx.json. Files in that section will only be copied over without performing any conversion on them.
+ There are some files like images, videos or PDFs that the documentation can refer to but are not converted by docfx. For those files there's a resource section in the docfx.json. Files in that section will only be copied over without performing any conversion on them.
Currently there's a definition for the following resource types:
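
As context for this hunk, a minimal sketch of what such a resource section in docfx.json can look like (the file globs are illustrative, not taken from this repository):

```json
{
  "build": {
    "resource": [
      {
        "files": [
          "Documentation/Images/**",
          "Documentation/Videos/**",
          "**/*.pdf"
        ]
      }
    ]
  }
}
```

Everything matched here is copied to the output verbatim; docfx performs no conversion on these files.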

Documentation/InputSimulation/InputSimulationService.md (3 additions & 3 deletions)
@@ -1,9 +1,9 @@
# Input Simulation Service
The Input Simulation Service emulates the behaviour of devices and platforms that may not be available in the Unity editor. Examples include:
- * Hololens or VR device head tracking
- * Hololens hand gestures
- * Hololens 2 articulated hand tracking
+ * HoloLens or VR device head tracking
+ * HoloLens hand gestures
+ * HoloLens 2 articulated hand tracking
Users can use a conventional keyboard and mouse combination to control simulated devices at runtime. This allows testing of interactions in the Unity editor without first deploying to a device.

Documentation/MixedRealityConfigurationGuide.md (2 additions & 2 deletions)
@@ -173,7 +173,7 @@ The diagnostics profile provides several simple systems to monitor whilst the pr
One of the more advanced areas of the Mixed Reality Toolkit is its [service locator pattern](https://en.wikipedia.org/wiki/Service_locator_pattern) implementation which allows the registering of any "Service" with the framework. This allows the framework to be both extended with new features / systems easily but also allows for projects to take advantage of these capabilities to register their own runtime components.
- > You can read more about the underlying framework and it's implementation in [Stephen Hodgson's article on the Mixed Reality Framework](https://medium.com/@stephen_hodgson/the-mixed-reality-framework-6fdb5c11feb2)
+ > You can read more about the underlying framework and its implementation in [Stephen Hodgson's article on the Mixed Reality Framework](https://medium.com/@stephen_hodgson/the-mixed-reality-framework-6fdb5c11feb2)
Any registered service still gets the full advantage of all of the Unity events, without the overhead and cost of implementing a MonoBehaviour or clunky singleton patterns. This allows for pure C# components with no scene overhead for running both foreground and background processes, e.g. spawning systems, runtime gamelogic, or practically anything else.
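
To ground the service-locator description above, here is a hedged C# sketch of such a pure C# service. The service name and logic are hypothetical, and the exact base class and constructor signature of extension services vary between Toolkit versions:

```csharp
using Microsoft.MixedReality.Toolkit; // namespace is version-dependent; treat as an assumption

// Hypothetical service contract; registered services receive the framework's
// Initialize/Update/Destroy callbacks without being MonoBehaviours.
public interface ISpawnerService : IMixedRealityExtensionService
{
    void SpawnWave();
}

public class SpawnerService : BaseExtensionService, ISpawnerService
{
    // Constructor shape is an assumption; some Toolkit versions also take a registrar.
    public SpawnerService(string name, uint priority, BaseMixedRealityProfile profile)
        : base(name, priority, profile) { }

    public override void Update()
    {
        // Per-frame foreground/background work driven by the framework,
        // with no scene object or singleton involved.
    }

    public void SpawnWave()
    {
        // Hypothetical runtime game logic.
    }
}
```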
@@ -253,7 +253,7 @@ There's an additional helper button to quickly jump to the Gaze Provider to over
## Gestures Configuration
- Gestures are a system specific implementation allowing you to assign Input Actions to the various "Gesture" input methods provided by various SDK's (e.g. HoloLens).
+ Gestures are a system specific implementation allowing you to assign Input Actions to the various "Gesture" input methods provided by various SDKs (e.g. HoloLens).
> Note, the current implementation is for the HoloLens only and will be enhanced for other systems as they are added to the Toolkit in the future (no dates yet).
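
As a hedged sketch of consuming those gesture Input Actions in code (the handler interface and method names are recalled from the Toolkit's input system and should be verified against your version):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Logs the Input Action carried by each gesture event raised by the input system.
public class GestureLogger : MonoBehaviour, IMixedRealityGestureHandler
{
    public void OnGestureStarted(InputEventData eventData) =>
        Debug.Log($"Gesture started: {eventData.MixedRealityInputAction.Description}");

    public void OnGestureUpdated(InputEventData eventData) { }

    public void OnGestureCompleted(InputEventData eventData) =>
        Debug.Log($"Gesture completed: {eventData.MixedRealityInputAction.Description}");

    public void OnGestureCanceled(InputEventData eventData) { }
}
```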

Documentation/README_AppBar.md (1 addition & 1 deletion)
@@ -4,7 +4,7 @@
App Bar is a UI component used with Bounding Box. Using the 'Adjust' button, you can turn on/off the Bounding Box interface for manipulating object.
## How to use App Bar
- Drog and drop **AppBar** prefab into the scene hierarchy. In the inspector panel of the AppBar, you will see **Bounding Box** under **Target Bounding Box** section. Assign any objects that has Bounding Box. **Important: Target object's Bounding Box activation option should be 'Activate Manually'**
+ Drag and drop the **AppBar** prefab into the scene hierarchy. In the inspector panel of the AppBar, you will see **Bounding Box** under the **Target Bounding Box** section. Assign any object that has a Bounding Box. **Important: Target object's Bounding Box activation option should be 'Activate Manually'**
- In order for the bounding box edges to be highlighted the same way when moving it using [`ManipulationHandler`](README_ManipulationHandler.md)'s far interaction, it is advised to connect its events for **On Manipulation Started** / **On Manipulation Ended** to `BoundingBox.HightlightWires` / `BoundingBox.UnhighlightWires` respectively, as shown in the screenshot above.
+ In order for the bounding box edges to be highlighted the same way when moving it using [`ManipulationHandler`](README_ManipulationHandler.md)'s far interaction, it is advised to connect its events for **On Manipulation Started** / **On Manipulation Ended** to `BoundingBox.HighlightWires` / `BoundingBox.UnhighlightWires` respectively, as shown in the screenshot above.
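
As a hedged illustration of the wiring this hunk describes, done in code rather than in the inspector (the event and method names come from the hunk itself; the component references are assumed to be assigned in the inspector):

```csharp
using Microsoft.MixedReality.Toolkit.UI; // namespace assumed; adjust per Toolkit version
using UnityEngine;

public class WireHighlighter : MonoBehaviour
{
    [SerializeField] private ManipulationHandler handler = null;
    [SerializeField] private BoundingBox boundingBox = null;

    private void OnEnable()
    {
        // Mirrors hooking On Manipulation Started / On Manipulation Ended in the inspector.
        handler.OnManipulationStarted.AddListener(_ => boundingBox.HighlightWires());
        handler.OnManipulationEnded.AddListener(_ => boundingBox.UnhighlightWires());
    }
}
```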
- The subtle pulse effect is triggerd by the `PressableButton.` The `PressableButton` looks for `ProximityLight(s)` that live on the currently interacting pointer. If any `ProximityLight(s)` are found, the ProximityLight.Pulse method is called which automatically animates shader parameters to display a pulse.
+ The subtle pulse effect is triggered by the `PressableButton`. The `PressableButton` looks for `ProximityLight(s)` that live on the currently interacting pointer. If any `ProximityLight(s)` are found, the `ProximityLight.Pulse` method is called, which automatically animates shader parameters to display a pulse.
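
And a hedged sketch of triggering the same pulse manually; `ProximityLight.Pulse` is named by the hunk, while the namespace and the parameterless call are assumptions:

```csharp
using Microsoft.MixedReality.Toolkit.Utilities; // assumed namespace for ProximityLight
using UnityEngine;

public class ManualPulse : MonoBehaviour
{
    private void Start()
    {
        // Mimic what PressableButton does: find a proximity light on the
        // interacting object and pulse it to animate the shader parameters.
        var proximityLight = GetComponentInChildren<ProximityLight>();
        if (proximityLight != null)
        {
            proximityLight.Pulse();
        }
    }
}
```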

Documentation/README_FingertipVisualization.md (1 addition & 1 deletion)
@@ -12,7 +12,7 @@ By default the fingertip visualization will work in any Unity scene that is conf
- PokePointer
- FingerCursor
- At a high level the fingertip visualization works by using a proximity light to project a colored gradient on any nearby surfaces that accept proximity lights. The finger cursor then looks for any nearby interactible surfaces, which are determined by parent IMixedRealityNearPointer(s), to align the finger ring with a surface as the finger moves towards a surface. As a finger approaches a surface the finger ring is also dynamically animated using the round corner properties of the MixedRealityStandard shader.
+ At a high level the fingertip visualization works by using a proximity light to project a colored gradient on any nearby surfaces that accept proximity lights. The finger cursor then looks for any nearby interactable surfaces, which are determined by parent IMixedRealityNearPointer(s), to align the finger ring with a surface as the finger moves towards it. As a finger approaches a surface, the finger ring is also dynamically animated using the round corner properties of the MixedRealityStandard shader.
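
A hedged sketch of the kind of round-corner animation the hunk refers to; `_RoundCornerRadius` is an assumed MixedRealityStandard shader property name, and the distance mapping is purely illustrative:

```csharp
using UnityEngine;

public class FingerRingAnimator : MonoBehaviour
{
    [SerializeField] private Renderer ringRenderer = null;

    // Shrink the ring as the fingertip approaches a surface.
    public void SetProximity(float distance, float maxDistance)
    {
        float t = Mathf.Clamp01(distance / maxDistance);
        // Assumed shader property; verify against the MixedRealityStandard shader.
        ringRenderer.material.SetFloat("_RoundCornerRadius", Mathf.Lerp(0.05f, 0.25f, t));
    }
}
```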

Documentation/README_Interactable.md (3 additions & 3 deletions)
@@ -69,7 +69,7 @@ Example of the Default Theme.
Example of a Color Theme
The best way to save a profile of a button, with all the themes and targets setup, is to create a prefab of your button.
- _Note: Themes that effect mesh objects (Color or Shader Themes) are able to detect the shader properties in the material assigned to the target object. A drop down list of shader properties will define how the values of the theme are applied and is a conveience of this ability. Conflicts can arrise if the same theme is used on objects that do not share the same material shader setting. Best practice is to create a seperate theme for objects with different shaders; this is not an issue when using the same Color Theme on a text object and a mesh object, because all the shader properties are ignored on text objects._
+ _Note: Themes that affect mesh objects (Color or Shader Themes) are able to detect the shader properties in the material assigned to the target object. A drop-down list of shader properties defines how the values of the theme are applied, which is a convenience of this ability. Conflicts can arise if the same theme is used on objects that do not share the same material shader setting. Best practice is to create a separate theme for objects with different shaders; this is not an issue when using the same Color Theme on a text object and a mesh object, because all the shader properties are ignored on text objects._
### Creating Toggles
Toggle or multi-step buttons can be created in the Profile using the Dimensions field. The idea is that each set of states can have multiple dimensions and in this case, when the Dimensions value is increased, slots for additional themes are provided for each item in the Profile. This allows for a Normal Theme and a Toggled Theme to be used depending if the Interactable is toggled or not.
@@ -116,7 +116,7 @@ States are a list of terms that can be used to define interactions phases, like
Interactable States provides two major roles.
- Establish a list of states that we care about. This list will be displayed in the themes and can also be referenced by the events.
- - Controls how different interaction phases are ranked into states. For instance, a press state is also in a focused state, but the InteractableStates class will define it is a press state based on the ranking preferences setup in the State scriptableObject.
+ - Controls how different interaction phases are ranked into states. For instance, a press state is also in a focused state, but the InteractableStates class will define it as a press state based on the ranking preferences set up in the State ScriptableObject.
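
To make the ranking concrete, a hedged sketch of observing the winning state at runtime; the `HasPress`/`HasFocus` property names are assumptions about the Interactable API:

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class StateProbe : MonoBehaviour
{
    [SerializeField] private Interactable interactable = null;

    private void Update()
    {
        // A pressed control is also focused; the state ranking decides that
        // Press "wins" for theming purposes.
        if (interactable.HasPress)
        {
            Debug.Log("Ranked state: Press (also Focused)");
        }
        else if (interactable.HasFocus)
        {
            Debug.Log("Ranked state: Focus");
        }
    }
}
```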

Documentation/README_ManipulationHandler.md (1 addition & 1 deletion)
@@ -47,7 +47,7 @@ Specifies how the object will behave when it is being grabbed with one hand/cont
### One Hand Rotation Mode Options
* Maintain original rotation - does not rotate object as it is being moved
* Maintain rotation to user - maintains the object's original rotation to the user
- * Gravity aligned maintain rotation to user - maintain's object's original rotation to user, but makes the object vertical. Useful for bounding boxes.
+ * Gravity aligned maintain rotation to user - maintains the object's original rotation to the user, but makes the object vertical. Useful for bounding boxes.
* Face user - ensures object always faces the user. Useful for slates/panels.
* Face away from user - ensures object always faces away from user. Useful for slates/panels that are configured backwards.
* Rotate about object center - Only works for articulated hands/controllers. Rotate object using rotation of the hand/controller, but about the object center point. Useful for inspecting at a distance.
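
For context, a hedged sketch of selecting one of these modes from code; the property and enum member names are assumptions about the ManipulationHandler API rather than taken from this diff:

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class ConfigureRotationMode : MonoBehaviour
{
    private void Start()
    {
        var handler = GetComponent<ManipulationHandler>();
        // Assumed property and enum names; verify against your Toolkit version.
        handler.OneHandRotationModeNear =
            ManipulationHandler.RotateInOneHandType.GravityAlignedMaintainRotationToUser;
    }
}
```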

Documentation/README_Pointers.md (1 addition & 1 deletion)
@@ -29,7 +29,7 @@ For other controllers like HoloLens 2 articulated hands, the rotation matches th
### GGV Pointer
GGV stands for "Gaze, Gesture, Voice"<sup>[2](https://docs.microsoft.com/en-us/windows/mixed-reality/gaze)</sup>. The GGV pointer's position and direction is driven by the head's position and rotation. The pointer is used to provide input that matches the HoloLens V1 input style of head gaze + airtap<sup>[3](https://docs.microsoft.com/en-us/windows/mixed-reality/gestures)</sup>.
- In the pointer profile you can see that the V1 Hololens input system is provided for you via the mapping of "GGVHand" (V1 HoloLens hand) to the GGVPointer.
+ In the pointer profile you can see that the V1 HoloLens input system is provided for you via the mapping of "GGVHand" (V1 HoloLens hand) to the GGVPointer.