## [2.1.2] - 2024-11-26
### Added
### Changed
- Restored `PolySpatialWindowManagerAccess.entityForIdentifier`/`identifierForEntity` that was removed when adding multi-volume support and added `entitiesForUnityInstanceId` to return Entities from all volumes.
- Improved performance of blend shapes by enabling asynchronous processing.
- Entities merged via static batching now include the source entities' synchronized components (sort groups, image based light receivers, environment lighting configurations, grounding shadows, and hover effects).
- Updated the Metal Samples presentation image and added a known limitations section to the Play to Device documentation.
### Deprecated
### Removed
### Fixed
- Fixed edge cases with blend shape support: meshes with no vertices and meshes with no bone weights are now supported.
- Fixed an issue where raycastable UGUI components would not be removed from the scene when hidden (e.g., when scrolling).
- Fixed a bug with tracked images over Play to Device.
### Security
**CHANGELOG.md** (21 additions, 0 deletions)

This project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).
For general changes to PolySpatial, refer to the [PolySpatial Changelog](https://docs.unity3d.com/Packages/com.unity.polyspatial@latest?subfolder=/changelog/CHANGELOG.html).
|**Property**|**Description**|
| --- | --- |
|**Initialize Hand Tracking On Startup**| Initializes hand tracking when the application starts up. |
|**Metal Immersion Style**| Defines the immersion style to be used in Metal-based apps. Immersion style determines whether an app will have pass-through. For Metal-based apps, only two immersion styles are valid: **Mixed** and **Full**. The **Progressive** immersion style is not supported with Metal-based apps. |
| *Automatic*| For Metal-based apps, choosing **Automatic** will cause the app to fall back to the **Mixed** immersion style. |
| *Full*| With this immersion style, pass-through is disabled, and only the skybox will be shown. |
| *Mixed*| With this immersion style, apps will have a pass-through video feed. This immersion style allows an app to switch at runtime between allowing pass-through and rendering the skybox. |
|**RealityKit Immersion Style**| Defines the immersion style to be used in RealityKit-based apps. Immersion style determines whether your application will have pass-through. Although all immersion styles are applicable to RealityKit-based apps, the effects of some immersion styles will differ depending on the **VolumeCamera** **Mode**. |
| *Automatic*| For RealityKit-based apps, choosing **Automatic** will cause the app to fall back to the **Mixed** immersion style. |
| *Full*| If all **VolumeCamera**s are set to **Bounded** **Mode**, content will still appear with pass-through, as if the app were in the **Mixed** immersion style. If one **VolumeCamera** in the scene is set to **Unbounded** **Mode**, pass-through will be disabled and replaced with a black background, and only content will be visible. The **Digital Crown** will have no effect. |
| *Mixed*| With this immersion style, apps will have a pass-through video feed. The **VolumeCamera** **Mode** has no effect on this immersion style. |
| *Progressive*| If all **VolumeCamera**s are set to **Bounded** **Mode**, content will still appear with pass-through, as if the app were in the **Mixed** immersion style. If one **VolumeCamera** in the scene is set to **Unbounded** **Mode**, content will appear inside a radial portal with a black background. See [Progressive Immersion](RealityKitApps.md#progressive-immersion) for more details. |
|**Upper Limb Visibility**| Controls whether the hands are visible within the app. This option is ignored if the **App Mode** is Windowed, or if the **App Mode** is set to RealityKit but the scene consists of only **Bounded** **VolumeCamera**s. |
| *Automatic*| Hand visibility will be determined by visionOS's default hands setting, which depends on the context. |
| *Visible*| In the **Metal** **App Mode**, the hands will always be visible in the app and always render on top of virtual content. In the **RealityKit** **App Mode**, the hands are blended with virtual content based on depth. |
| *Hidden*| In the **Metal** **App Mode**, the hands will be hidden, which may be beneficial if you wish to present virtual hands instead. In the **RealityKit** **App Mode**, the hands will be displayed behind virtual content. |
|**Foveated Rendering**| Controls whether foveated rendering is enabled or disabled. This option only applies to Metal-based content and requires the Universal Render Pipeline. |
|**IL2CPP Large Exe Workaround**| When building your project in Xcode, you may encounter an error such as `ARM64 branch out of range`, especially with larger projects. This option patches the project to work around this error. |

After modifying the project settings, select and build for the visionOS platform in the **Build Profiles** menu. This will generate a Unity Xcode project. From here on, you'll continue the build process in the Xcode project, debugging in Xcode as needed.
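The immersion-style fallback rules described above can be summarized in a small sketch. This is illustrative Python only, not a Unity or PolySpatial API: choosing **Automatic** resolves to **Mixed** in both app modes, and **Progressive** is rejected for Metal-based apps.

```python
# Illustrative sketch of the immersion-style fallback rules described above.
# These names are for illustration only; they are not a Unity or PolySpatial API.

VALID_STYLES = {
    "Metal": {"Mixed", "Full"},
    "RealityKit": {"Mixed", "Full", "Progressive"},
}

def resolve_immersion_style(app_mode: str, requested: str) -> str:
    """Return the effective immersion style for the given app mode."""
    if requested == "Automatic":
        # Per the settings table, Automatic falls back to Mixed in both app modes.
        return "Mixed"
    if requested not in VALID_STYLES[app_mode]:
        raise ValueError(f"{requested} is not supported for {app_mode}-based apps")
    return requested

print(resolve_immersion_style("Metal", "Automatic"))        # Mixed
print(resolve_immersion_style("RealityKit", "Progressive"))  # Progressive
```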
The [visionOS project settings](VisionOSSettings.md) menu has a few properties pertaining to the **Metal Rendering with Compositor Services** app mode.

> Please note that Unity is still building towards feature parity with the Metal API on Xcode, so you might observe warnings from Metal's API validation layer. To work around this, you can turn off the Metal API Validation Layer via Xcode's scheme menu.
**Documentation~/MetalSamples.md** (1 addition, 1 deletion)
## Scene elements
The [ARSession](xref:arfoundation-session) and [ARInputManager](xref:arfoundation-session#ar-input-manager) components must exist in all Metal and Hybrid scenes on visionOS. These components are required because the visionOS platform provides head and hand tracking data through Apple's ARKit. These components can be on any GameObject in your scene, but only one of each should be present. In the Metal Sample scene, the components are on the GameObject named "AR Session." In addition to head and hand tracking data, ARKit on visionOS provides world meshes, planes, and image markers tracked by the device. With visionOS 2.0 we support object tracking and environment probes as well.
> [!NOTE]
> On visionOS, Apple ARKit features are implemented by the [Apple visionOS XR Plugin](https://docs.unity3d.com/Packages/com.unity.xr.visionos@latest) package (com.unity.xr.visionos). You do not need the [Apple ARKit](https://docs.unity3d.com/Packages/com.unity.xr.arkit@latest) package (com.unity.xr.arkit), which implements ARKit features for iOS.
**Documentation~/PlayToDevice.md** (10 additions, 0 deletions)

* In the build details view, tap `Install`.
You can find more information about TestFlight and how to test previous builds in the [official documentation](https://testflight.apple.com/#testing-previous-builds).
## Limitations of using Play to Device
Even though Play to Device is a powerful tool for iterating on your content, it will not reproduce how some features work on a real device. Here are some limitations to keep in mind:

* Video component: the [VisionOSVideoComponent](VideoComponent.md#visionosvideocomponent) may not work as expected when connected to a Play to Device host. Over a network, the VisionOSVideoComponent will not work, while the Unity VideoPlayer component will, albeit at a performance cost.
* Play to Device may incur a performance hit when using render textures, and render textures may render slowly or stutter while connected.
* Play to Device may not be able to accommodate your `VolumeCamera`'s specific `Output Dimension`. The number of available `VolumeCamera` window sizes (`OutputDimension`) is limited, and Play to Device will attempt to match the requested dimensions to the available window sizes.
* Object tracking is not supported in the Play to Device app.
* XR Meshes over windows are not supported over Play to Device.
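The window-size limitation above amounts to a nearest-match selection: the host picks the closest available size to the one you requested. A minimal sketch of that idea, assuming a hypothetical list of sizes (Play to Device's actual window sizes are not documented here):

```python
# Sketch of matching a requested VolumeCamera output dimension to a limited
# set of host window sizes. AVAILABLE_DIMENSIONS is hypothetical, chosen only
# to illustrate the nearest-match behavior described above.

AVAILABLE_DIMENSIONS = [
    (0.25, 0.25, 0.25),
    (0.5, 0.5, 0.5),
    (1.0, 1.0, 1.0),
    (2.0, 2.0, 2.0),
]

def closest_dimension(requested, available=AVAILABLE_DIMENSIONS):
    """Pick the available window size closest (squared Euclidean) to the request."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(available, key=lambda dim: dist2(dim, requested))

print(closest_dimension((0.9, 1.1, 1.0)))  # (1.0, 1.0, 1.0)
```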
VisionOS provides image based lighting as well as dynamic point, spot, and directional lights. PolySpatial also includes a lighting solution available to shader graphs that provides a subset of the lighting features available in Unity. The `PolySpatial Lighting Data Extension` and `PolySpatial Lighting Node` support directional lightmaps, light probes, and up to four dynamic lights (point, spot, or directional).
## VisionOS Lighting
Lit materials (both standard shaders such as `Universal Render Pipeline/Lit` and shader graphs using Lit targets) use visionOS lighting by default. This lighting comprises [image based lighting](ImageBasedLight.md) (derived from the device cameras and/or environment maps) and, optionally, dynamic point, spot, and directional lights. Spot and directional lights support shadows.
Because visionOS uses different lighting calculations from Unity, the appearance of lit objects in visionOS will not exactly match that of objects rendered in the Unity editor. To achieve a closer match for dynamic lights, you may wish to use the `PolySpatial Lighting Data Extension` or `PolySpatial Lighting Node`.
### VisionOS Light Settings
To control the default behavior for dynamic lights in visionOS, use the `Default VisionOS Lighting` option under `PolySpatial` in `Project Settings`. This defaults to `Image Based Only`, but you can change it to `Image Based and Dynamic Lights` to enable point/spot/directional lights for standard Lit materials in visionOS, or to `Image Based, Dynamic Lights, and Shadows` to enable spot/directional shadows as well.
To control visionOS light behavior on a per-light basis, add an instance of the `VisionOS Light Settings` component to the `GameObject` containing the `Light` instance. This has the same options as the defaults in the `PolySpatial` project settings.
## PolySpatial Lighting Data Extension
To add PolySpatial lighting to a shader graph with a `Lit` material, add the `PolySpatial Lighting` data extension to the `Data Extension Settings` section, located under `Target Settings` in the `Graph Inspector`. This will cause the MaterialX version of the shader graph used in visionOS to include lighting functionality as determined by the extension options:
|**Option**|**Description**|
| --- | --- |
|**Baked Lighting**| The Unity baked lighting to include: either `None`, `Lightmap`, or `LightProbes`. |
|**Reflection Probes**| The Unity reflection probe contribution: either `None`, `Simple`, or `Blended`. |
|**VisionOS Lighting**| If true, *also* apply visionOS lighting (image based and dynamic point/spot/directional lights, as described in the previous section). |
In the Unity editor view (in or out of play mode), the `PolySpatial Lighting Data Extension` will have no visible effect: lit shader graphs will simply use Unity's standard lighting model. The options in the extension apply only to the appearance of the shader graph in visionOS. Without the extension, in visionOS, lit shader graphs will behave as if only the `VisionOS Lighting` option was enabled.
The Unity-specific lightmap, light probe, reflection probe, and dynamic lighting is subject to the limitations of the `PolySpatial Lighting Node` described in the following section.
## PolySpatial Lighting Node
As an alternative to the `PolySpatial Lighting Data Extension`, you may instead include PolySpatial lighting by creating an instance of the `PolySpatial Lighting Node` using the `Create Node` command in the shader graph editor. This allows you to process the lighting output before routing it to an output block (such as `Base Color` or `Emission`), but it requires that you explicitly provide the lighting inputs such as `Metallic` and `Smoothness` to the node and it *will only appear correctly in builds and in play mode*. Because it requires information from the PolySpatial runtime, it will not appear correctly in material previews or the editor scene view.
### Inputs
The inputs to the lighting node are largely the same as the inputs to the `Lit` shader graph target (`Base Color` (albedo), `Normal` (in tangent space), `Metallic`, `Smoothness`, `Emission`, and `Ambient Occlusion`), with the addition of a `Lightmap UV` input for lightmap texture coordinates.
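To make the inputs listed above concrete, here is a deliberately simplified sketch of the kind of computation such a node performs: a baked lightmap sample plus Lambert diffuse from up to four dynamic lights, with emission added on top. This is an illustration only, not the node's actual shading code; specular, `Metallic`/`Smoothness`, and `Ambient Occlusion` handling are omitted.

```python
# Simplified illustration of combining baked and dynamic lighting, in the
# spirit of the PolySpatial Lighting Node's inputs. Not the node's real code:
# specular, metallic/smoothness, and ambient occlusion are omitted.

def saturate(x):
    return max(0.0, min(1.0, x))

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def lighting_sketch(base_color, normal, emission, lightmap_sample, lights):
    """lights: list of up to four (direction_to_light, color, intensity) tuples."""
    assert len(lights) <= 4, "the node supports up to four dynamic lights"
    out = []
    for c in range(3):  # per RGB channel
        diffuse = lightmap_sample[c]  # baked contribution
        for direction, color, intensity in lights:
            n_dot_l = saturate(dot3(normal, direction))  # Lambert term
            diffuse += color[c] * intensity * n_dot_l
        out.append(base_color[c] * diffuse + emission[c])
    return tuple(out)

# One white light from straight above a surface facing up:
print(lighting_sketch(
    base_color=(1.0, 0.0, 0.0), normal=(0.0, 1.0, 0.0),
    emission=(0.0, 0.0, 0.0), lightmap_sample=(0.1, 0.1, 0.1),
    lights=[((0.0, 1.0, 0.0), (1.0, 1.0, 1.0), 1.0)],
))
```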