CHANGELOG.md (8 additions, 1 deletion)

```diff
@@ -9,6 +9,14 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
 
 For general changes to PolySpatial, refer to the [PolySpatial Changelog](https://docs.unity3d.com/Packages/com.unity.polyspatial@latest?subfolder=/changelog/CHANGELOG.html).
 
+## [1.3.1] - 2024-07-09
+
+## [1.3.0] - 2024-06-26
+
+### Added
+- Added support for adding new reference images at runtime; refer to the [ARFoundation](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@6.0/manual/features/image-tracking.html#add-new-reference-images-at-runtime) documentation.
+- Added tracked image support to the "PolySpatial XR" Plug-in Provider, under XR Plug-in Management.
+
 ## [1.2.3] - 2024-04-23
 
 ## [1.2.2] - 2024-04-22
@@ -18,7 +26,6 @@ For general changes to PolySpatial, refer to the [PolySpatial Changelog](https:/
 ## [1.2.0] - 2024-04-19
 
 ### Added
-
 - Added a loading screen during initial Play To Device loading
 - Added support for procedural skinned meshes. Updating a skinned mesh will now notify all skinned mesh renderers using that mesh to update.
```
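The runtime reference-image entry above relies on ARFoundation's mutable image library. A minimal sketch of that workflow follows; the component setup, image name (`"poster"`), and physical width are illustrative assumptions, not values from this changelog:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: adding a reference image at runtime via ARFoundation.
// Assumes the scene has an ARTrackedImageManager and that the active
// provider supports mutable reference image libraries.
public class RuntimeImageAdder : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager m_ImageManager;
    [SerializeField] Texture2D m_NewImage; // texture must be readable

    void Start()
    {
        if (m_ImageManager.referenceLibrary is MutableRuntimeReferenceImageLibrary mutableLibrary)
        {
            // Name and physical width (in meters) are illustrative values.
            mutableLibrary.ScheduleAddImageWithValidationJob(m_NewImage, "poster", 0.25f);
        }
        else
        {
            Debug.LogWarning("Current provider does not support adding images at runtime.");
        }
    }
}
```

See the linked ARFoundation image-tracking documentation for provider support and validation details.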
Documentation~/PlayToDevice.md (1 addition, 1 deletion)

```diff
@@ -72,7 +72,7 @@ Additionally, the PlayToDevice UI will appear in the last used **VolumeCameraWin
 
 ## AR visualization in editor
 
-You can visualize AR planes and hand data from your device in your editor when using the Play To Device feature. To enable this functionality go to `Project Settings` > `XR Plug-in Management`.
+You can visualize AR planes, tracked images, meshes, and hand data from your device in your editor when using the Play To Device feature. To enable this functionality, go to `Project Settings` > `XR Plug-in Management`.
 
 Under the Standalone target, look for `Plug-in Providers` and turn on `PolySpatial XR`. Make sure to disable `XR Simulation` if it is on.
```
Documentation~/ShaderGraph.md (17 additions, 13 deletions)

```diff
@@ -22,8 +22,8 @@ To obtain positions, normals, tangents, or bitangents in the world space of the
 ### Notes on Transform and Transformation Matrix nodes in VisionOS
 The matrices returned by the `Transformation Matrix` node and used by the `Transform` node are obtained directly from visionOS and currently assume a world space that does not match either the simulation scene or the output of the `Position`, `Normal Vector`, `Tangent Vector`, or `Bitangent Vector` nodes. The "world space" output of those nodes is relative to the transform of the output volume--that is, it does not change when a bounded app volume is dragged around. The `Transform` and `Transformation Matrix` nodes, on the other hand, assume a world space that is shared between all app volumes. To get geometry in this world space, use the geometry (e.g., `Position`) node with `Space`: `Object` and transform it with the `Transform` node set to `From`: `Object` and `To`: `World`.
 
-## Input properties
-Shader graph properties must be set to `Exposed` in order to be set on a per-instance basis. Globals must *not* be `Exposed`, and global values must be set in C# using the methods of [PolySpatialShaderGlobals](https://docs.unity3d.com/Packages/com.unity.polyspatial@latest?subfolder=/api/Unity.PolySpatial.PolySpatialShaderGlobals.html#methods).
+## Global properties
+Global values must be set in C# using the methods of [PolySpatialShaderGlobals](https://docs.unity3d.com/Packages/com.unity.polyspatial@latest?subfolder=/api/Unity.PolySpatial.PolySpatialShaderGlobals.html#methods).
 
 ### Time-based animation
 Note that visionOS materials do not support global properties natively, and thus PolySpatial must apply global properties separately to all material instances, which may affect performance. For animation, consider using the `PolySpatial Time` node rather than the standard Unity shader graph `Time`. While `PolySpatial Time` will not be exactly synchronized with [Time.time](https://docs.unity3d.com/ScriptReference/Time-time.html) (notably, it will not reflect changes to [Time.timeScale](https://docs.unity3d.com/ScriptReference/Time-timeScale.html)), it is supported natively in visionOS and does not require per-frame property updates.
```
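The global-properties workflow described in this hunk might look like the following sketch. The property name `_GlobalTint` is illustrative, and a static `SetColor(string, Color)` method is assumed to be among the `PolySpatialShaderGlobals` methods linked above; check that API reference for the exact signatures:

```csharp
using UnityEngine;
using Unity.PolySpatial;

// Sketch: driving a non-Exposed (global) shader graph property from C#.
// "_GlobalTint" is a hypothetical property name for illustration only.
public class GlobalTintDriver : MonoBehaviour
{
    void Update()
    {
        // Ping-pong between white and red over one second.
        var t = Mathf.PingPong(Time.time, 1f);
        PolySpatialShaderGlobals.SetColor("_GlobalTint", Color.Lerp(Color.white, Color.red, t));
    }
}
```

As the "Time-based animation" note points out, per-frame updates like this touch every material instance; prefer the `PolySpatial Time` node for pure time-based effects.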
```diff
@@ -113,7 +113,9 @@ If a node doesn't appear here it means that it's not currently supported. *Note
 The **PolySpatialStaticBatchElement** component provides a hint to the platform that a `GameObject` containing a `MeshRenderer` will never move relative to a root `GameObject` or to the scene root (depending on the value of the `Root` property). This allows the platform to batch meshes that share the same root together, reducing the number of draw calls and (often) improving performance. For more information on static batching, refer to the documentation for [StaticBatchingUtility](https://docs.unity3d.com/ScriptReference/StaticBatchingUtility.html). Note that on visionOS, PolySpatial cannot separately control the visibility of batch elements. Elements with the same material and lighting parameters are simply combined into a single mesh and rendered together.
+
+The `PolySpatial Static Batch Element` component exposes the following properties:
+
+|**Property**|**Description**|
+| --- | --- |
+|**Root**| The root `GameObject` relative to which the element will stay fixed, or `None` if it will stay fixed relative to the scene root. |
+|**Apply to Descendants**| If true, all descendants of the GameObject to which the component is attached will also be considered static with respect to the root. |
+
+## Static Editor Flags
+If [Static Batching](https://docs.unity3d.com/Manual/static-batching.html) is enabled in the Player settings, GameObjects with `Batching Static` enabled will automatically receive instances of `PolySpatialStaticBatchElement` that batch them relative to the scene root.
```
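Adding the component from script might look like the sketch below. The scripting property names (`Root`, `ApplyToDescendants`) are assumptions that mirror the Inspector labels in the table above; verify them against the PolySpatial API reference:

```csharp
using UnityEngine;
using Unity.PolySpatial;

// Sketch: marking a hierarchy of renderers as static relative to a shared
// root at runtime. Property names here are assumed from the Inspector labels.
public class BatchSetup : MonoBehaviour
{
    [SerializeField] GameObject m_BatchRoot; // renderers under this root never move relative to it

    void Start()
    {
        var element = m_BatchRoot.AddComponent<PolySpatialStaticBatchElement>();
        // Batch every descendant renderer together with the root.
        element.ApplyToDescendants = true;
    }
}
```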
Documentation~/visionOSPlatformOverview.md (11 additions)

```diff
@@ -23,3 +23,14 @@ Unity supports several different application types on visionOS, each with their
 * If you're interested in creating fully immersive virtual reality (VR) apps for visionOS, refer to [Fully Immersive VR apps on visionOS](VRApps.md) for more information.
 * If you're interested in creating immersive mixed reality (MR) apps for visionOS, refer to [PolySpatial MR Apps on visionOS](PolySpatialMRApps.md) for more information. These apps are built with Unity's newly developed PolySpatial technology, where apps are simulated with Unity, but rendered with RealityKit, the system renderer of visionOS.
 * If you're interested in creating content that will run in a window on visionOS, refer to [Windowed Apps on visionOS](WindowedApps.md) for more information.
+
+### AR Authorizations
+In order to use ARKit features like hand tracking and world sensing, your app must prompt the user for authorization. These prompts display a customizable usage description, which must be provided in the visionOS settings under `Project Settings > XR Plug-in Management > Apple visionOS`. Unity apps can make use of ARKit features on visionOS by using [AR Foundation](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@latest) components like `ARPlaneManager`. For visionOS specifically, there are two types of AR authorization:
+- Hand Tracking
+- World Sensing
+As the name implies, Hand Tracking authorization is needed to make use of ARKit's hand tracking capabilities, which are exposed in Unity via the XR Hands package (`com.unity.xr.hands`). World Sensing authorization applies to the remaining ARKit features, such as planes, meshes, image tracking, and world anchors. Head pose is the one exception: it is exposed via ARKit but does not require any authorization.
+
+These authorizations are requested automatically by the visionOS XR Plugin as features are needed. For example, when an `ARPlaneManager` is enabled and the user has not already been prompted to authorize the app to use World Sensing features, a dialog will appear showing the world sensing usage description, with buttons labeled `Allow` and `Deny`. Once the user responds to this dialog, the authorization is stored along with other app metadata, and it remains valid until the app is uninstalled or the user manually changes it for the app in Settings.
+
+We provide scripting APIs for querying the state of a particular authorization. You can either call [VisionOS.QueryAuthorizationStatus](xref:UnityEngine.XR.VisionOS.VisionOS.QueryAuthorizationStatus) to get the status of a particular authorization type, or subscribe to the [VisionOS.AuthorizationChanged](xref:UnityEngine.XR.VisionOS.VisionOS.AuthorizationChanged) event to be informed of authorization changes. Usage of these APIs is demonstrated by the `Debug` UI panel in the main package sample scene for `com.unity.xr.visionos`.
```
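A sketch of the authorization APIs named above follows. The `QueryAuthorizationStatus` and `AuthorizationChanged` members come from the xrefs in this hunk; the enum and event-payload names (`VisionOSAuthorizationType.WorldSensing`, the `authorization` struct fields) are assumptions to be checked against the `com.unity.xr.visionos` API reference:

```csharp
using UnityEngine;
using UnityEngine.XR.VisionOS;

// Sketch: querying and observing AR authorization state on visionOS.
// Enum member and payload field names are assumptions for illustration.
public class AuthorizationWatcher : MonoBehaviour
{
    void OnEnable()
    {
        // One-shot query for the current World Sensing authorization state.
        var status = VisionOS.QueryAuthorizationStatus(VisionOSAuthorizationType.WorldSensing);
        Debug.Log($"World Sensing authorization: {status}");

        // Subscribe to be notified when any authorization changes.
        VisionOS.AuthorizationChanged += OnAuthorizationChanged;
    }

    void OnDisable() => VisionOS.AuthorizationChanged -= OnAuthorizationChanged;

    void OnAuthorizationChanged(VisionOSAuthorization authorization)
    {
        Debug.Log($"Authorization changed: {authorization.type} -> {authorization.status}");
    }
}
```

The `Debug` UI panel in the package's main sample scene demonstrates the same pattern.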