# Mixed Reality Configuration Guide
The Mixed Reality Toolkit centralizes as much of the configuration required to manage the toolkit as possible (except for things that are truly runtime-only).
**This guide is a simple walkthrough of each of the configuration screens currently available in the toolkit; more in-depth guides for each feature are coming soon.**
Configuration profiles provide reusable blocks of configuration that can be used and swapped out at runtime (with the exception of the InputActions profile) to meet the demands of most Mixed Reality projects. This allows you to tailor your configuration for different input types (driving vs flying) or the different behaviors your project needs.
> For more details on profile use, please check the [Profile Usage Guide]() (Coming soon)
In some cases, we also allow you to swap out the underlying system that provides a capability with either your own service or an alternate implementation (e.g. swapping out the speech provider from an OS version to one on Azure).
> For more detail on writing your own compatible systems for use in the toolkit, please see the [Guide to building Registered Services]() (Coming soon)
## The main Mixed Reality Toolkit Configuration profile
The main configuration profile, which is attached to the *MixedRealityToolkit* GameObject in your Scene, provides the main entry point for the Toolkit in your project.
> The Mixed Reality Toolkit "locks" the default configuration screens to ensure you always have a common start point for your project, and we encourage you to start defining your own settings as your project evolves.
## Input System Settings
The Mixed Reality Toolkit provides a robust, well-tested input system for routing all the input events around the project, which is selected by default.
> The MRTK also allows you to write your own Input System, and you can use the selection below to switch the system used without rewriting the toolkit. For more information on writing your own systems, [please see this guide]() (Coming soon)
Behind the Input System provided by the MRTK are several other systems; these help to drive and manage the complex interweaving required to abstract out the complexities of a multi-platform / mixed reality framework.
Each of the individual profiles is detailed below:
## Boundary Visualization Settings
The boundary system translates the perceived boundary reported by the underlying platform's boundary / guardian system. The boundary visualizer configuration gives you the ability to automatically show the recorded boundary within your scene relative to the user's position. The boundary will also react / update based on where the user teleports within the scene.
## Diagnostics Settings
An optional but highly useful feature of the MRTK is the plugin Diagnostics functionality. This presents a style of debug log directly in the scene.
> The MRTK also allows you to write your own Diagnostic System, and you can use the selection below to switch the system used without rewriting the toolkit. For more information on writing your own systems, [please see this guide]() (Coming soon)
The diagnostics profile provides several simple systems to monitor whilst the project is running, including a handy On/Off switch to enable / disable the display pane in the scene.
> Clicking on the "Back to Configuration Profile" button will take you back to the main Mixed Reality Toolkit Configuration screen.
## Additional Services Settings
One of the more advanced areas of the Mixed Reality Toolkit is its [service locator pattern](https://en.wikipedia.org/wiki/Service_locator_pattern) implementation, which allows the registering of any "Service" with the framework. This allows the framework to be easily extended with new features / systems, and also allows projects to take advantage of these capabilities to register their own runtime components.
> You can read more about the underlying framework and its implementation in [Stephen Hodgson's article on the Mixed Reality Framework](https://medium.com/@stephen_hodgson/the-mixed-reality-framework-6fdb5c11feb2)
Any registered service still gets the full advantage of all of the Unity events, without the overhead and cost of implementing a MonoBehaviour or clunky singleton patterns. This allows for pure C# components with no scene overhead for running both foreground and background processes, e.g. spawning systems, runtime game logic, or practically anything else.
[Check out the supporting documentation for more details about creating your own Service Providers]() (Coming Soon)
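As a rough illustration of the idea, a registered service is just a plain C# class implementing the toolkit's service contract; it can run per-frame logic with no GameObject in the scene. The interface shape and method names below are illustrative assumptions for this sketch, not the toolkit's exact API.

```csharp
// Assumed service contract for illustration only; the real toolkit interface
// may differ in name and members.
public interface IMixedRealityService
{
    void Initialize(); // called once when the service is registered
    void Update();     // called every frame by the toolkit's update loop
    void Destroy();    // called when the service is unregistered
}

// A background spawning system as a pure C# service: no MonoBehaviour,
// no singleton, no scene object.
public class SpawnerService : IMixedRealityService
{
    private float elapsed;

    public void Initialize() => elapsed = 0f;

    public void Update()
    {
        // Driven by the toolkit's update loop rather than Unity's
        // MonoBehaviour.Update; accumulate time and spawn on an interval.
        elapsed += UnityEngine.Time.deltaTime;
    }

    public void Destroy() { }
}
```

The service would then be registered with the toolkit through the Additional Services configuration screen described above.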
> Clicking on the "Back to Configuration Profile" button will take you back to the main Mixed Reality Toolkit Configuration screen.
## Input Actions Settings
Input Actions provide a way to abstract any physical interactions and input from a runtime project. All physical input (from controllers / hands / mouse / etc.) is translated into a logical Input Action for use in your runtime project. This ensures that no matter where the input comes from, your project simply implements these actions as "things to do" or "interact with" in your scenes.
To create a new Input Action, simply click the "Add a new Action" button and enter a friendly text name for what it represents. You then only need to select an Axis (the type of data) the action is meant to convey or, in the case of physical controllers, the physical input type it can be attached to, for example:
| Axis Constraint | Data Type | Description | Example use |
| :--- | :--- | :--- | :--- |
| None | No data | Used for an empty action or event | Event Trigger |
| Raw (reserved) | object | Reserved for future use | N/A |
| Digital | bool | A boolean on or off type of data | A controller button |
| Single Axis | float | A single-precision data value | A ranged input, e.g. a trigger |
| Dual Axis | Vector2 | Dual float type data for multiple axes | A DPad or thumbstick |
| Three Dof Position | Vector3 | Positional data with 3 float axes | A 3D position-only style controller |
| Three Dof Rotation | Quaternion | Rotation-only input with 4 float axes | A three-degrees-of-freedom style controller, e.g. Oculus Go controller |
| Six Dof | Mixed Reality Pose (Vector3, Quaternion) | A position and rotation style input with both Vector3 and Quaternion components | A motion controller or Pointer |
Events utilizing Input Actions are not limited to physical controllers; they can also be utilized within the project to have runtime effects generate new actions.
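To illustrate the abstraction, the sketch below shows project code reacting to a logical action name rather than a physical button. The class and method names here are hypothetical and only illustrate the pattern; they are not the toolkit's event API.

```csharp
// Hypothetical consumer of a logical Input Action (illustration only).
public class DoorOpener
{
    private readonly string selectAction;

    // The action name matches one configured in the Input Actions profile,
    // e.g. "Select".
    public DoorOpener(string selectActionName) => selectAction = selectActionName;

    // Imagined callback invoked by the input system when any action fires.
    public void OnActionStarted(string actionName)
    {
        // The door does not care whether this came from a controller button,
        // a hand gesture, a speech command, or the mouse; it only sees the
        // logical action.
        if (actionName == selectAction)
        {
            Open();
        }
    }

    private void Open() { /* play animation, enable physics, etc. */ }
}
```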
> Input Actions are one of the few components which are not editable at runtime; they are a design-time configuration only. This profile should not be swapped out whilst the project is running, due to the framework's (and your project's) dependency on the IDs generated for each action.
> Clicking on the "Back to Configuration Profile" button will take you back to the Mixed Reality Toolkit Input System Settings screen.
## Input Actions Rules
Input Action Rules provide a way to automatically translate an event raised for one Input Action into different actions based on its data value. These are managed seamlessly within the framework and do not incur any performance costs.
For example, converting the single Dual Axis input event from a DPad into the four corresponding DPad Up / DPad Down / DPad Left / DPad Right actions (as shown in the image below).
> This could also be done in your own code, but seeing as this was a very common pattern, the framework provides a mechanism to do this "out of the box".
Input Action Rules can be configured for any of the available input axes. However, an Input Action of one Axis type can only be translated to another Input Action of the same Axis type: you can map a Dual Axis action to another Dual Axis action, but not to a Digital or None action.
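Conceptually, a rule like the DPad example compares the incoming Dual Axis value against per-rule criteria and raises whichever directional action matches (each directional action is itself still a Dual Axis action, carrying the same value). A minimal sketch of that logic, with an assumed dead-zone threshold, not the toolkit's actual implementation:

```csharp
// Conceptual sketch of a Dual Axis -> directional-action rule.
// The 0.5f dead-zone threshold is an illustrative assumption.
public static class DPadRules
{
    // Returns the name of the action whose criteria match the incoming
    // Dual Axis value, or null if the input sits inside the dead zone.
    public static string Translate(float x, float y, float threshold = 0.5f)
    {
        if (y >= threshold)  return "DPad Up";
        if (y <= -threshold) return "DPad Down";
        if (x <= -threshold) return "DPad Left";
        if (x >= threshold)  return "DPad Right";
        return null; // no rule matched; the original action stands alone
    }
}
```

In the real profile this mapping is configured in the inspector rather than written by hand; the sketch only shows the kind of value-based translation the rule performs.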
> Clicking on the "Back to Configuration Profile" button will take you back to the Mixed Reality Toolkit Input System Settings screen.
## Pointer Configuration
Pointers are used to drive interactivity in the scene from any input device, giving both a direction and hit test with any object in a scene (that has a collider attached, or is a UI component). Pointers are by default automatically configured for controllers, headsets (gaze/focus) and mouse/touch input.
Pointers can also be visualized within the active scene using one of the many Line components provided by the Mixed Reality Toolkit, or any of your own, provided they implement the MRTK IMixedRealityPointer interface.
> See the [Guide to Pointers documentation]() **(Coming Soon)** for more information on creating your own pointers.
> Clicking on the "Back to Configuration Profile" button will take you back to the Mixed Reality Toolkit Input System Settings screen.
## Gestures Configuration
Gestures are a system-specific implementation allowing you to assign Input Actions to the various "Gesture" input methods provided by various SDKs (e.g. HoloLens).
> Note, the current implementation is for the HoloLens only and will be enhanced for other systems as they are added to the Toolkit in the future (no dates yet).
> Clicking on the "Back to Configuration Profile" button will take you back to the Mixed Reality Toolkit Input System Settings screen.
## Speech Commands
Like Gestures, some runtime platforms also provide intelligent Speech-to-Text functionality with the ability to generate "Commands" that can be received by a Unity project. This configuration profile allows you to configure registered "words" and translate them into Input Actions that can be received by your project. (They can also be attached to keyboard actions if required.)
> The system currently only supports speech when running on Windows 10 platforms, e.g. HoloLens and Windows 10 desktop, and will be enhanced for other systems as they are added to the Toolkit in the future (no dates yet).
> Clicking on the "Back to Configuration Profile" button will take you back to the Mixed Reality Toolkit Input System Settings screen.
## Controller Mapping Configuration
One of the core configuration screens for the Mixed Reality Toolkit is the one that lets you configure and map the various types of controllers that can be utilized by your project.
The configuration screen below allows you to configure any of the controllers currently recognized by the toolkit.
The MRTK provides a default configuration for the following controllers / systems:
* Mouse (including 3D spatial mouse support)
* Touch Screen
* Oculus Remote controller
* Generic OpenVR devices (advanced users only)
Clicking on the image for any of the pre-built controller systems allows you to configure a single Input Action for all its corresponding inputs; for example, see the Oculus Touch controller configuration screen below:
There is also an advanced screen for configuring other OpenVR or Unity input controllers that are not identified above.
> Clicking on the "Back to Configuration Profile" button will take you back to the Mixed Reality Toolkit Input System Settings screen.
---
<a name="visualization"></a>
## Controller Visualization Settings
In addition to the Controller mapping, a separate configuration profile is provided to customize how your controllers are presented within your scenes.
This can be configured at a "Global" level (all instances of a controller for a specific hand), or specific to an individual controller type / hand.
> The MRTK does not currently support the native SDKs' controller models, as Unity does not yet provide the capability to load / render glTF models, which is the default model format provided by most SDKs. This will be enhanced when that capability is available.
If your controller representation in the scene needs to be offset from the physical controller position, simply set that offset against the controller model's prefab (e.g. set the transform position of the controller prefab with an offset position).
0 commit comments