Commit 2e6ee0f

Merge pull request #110182 from MicrosoftDocs/release-preview-arr
Release preview arr
2 parents 959f6dc + f378b63 commit 2e6ee0f

File tree

203 files changed: +7213 −3 lines changed


articles/index.yml

Lines changed: 7 additions & 0 deletions

@@ -1188,6 +1188,13 @@ productDirectory:
       - mixed-reality
     url: spatial-anchors/index.yml
   # Card
+  - title: Remote Rendering (Preview)
+    summary: Render high-quality, interactive 3D content, and stream it to your devices in real time
+    imageSrc: ./media/index/remote-rendering.svg
+    azureCategories:
+      - mixed-reality
+    url: remote-rendering/index.yml
+  # Card
   - title: Visual Studio App Center
     summary: Continuously build, test, release, and monitor your mobile and desktop apps
     imageSrc: https://docs.microsoft.com/media/logos/logo_vs-mobile-center.svg
Lines changed: 30 additions & 0 deletions
Lines changed: 48 additions & 0 deletions

---
title: Components
description: Definition of components in the scope of Azure Remote Rendering
author: florianborn71
ms.author: flborn
ms.date: 02/04/2020
ms.topic: conceptual
---

# Components

Azure Remote Rendering uses the [Entity Component System](https://en.wikipedia.org/wiki/Entity_component_system) pattern. While [entities](entities.md) represent the position and the hierarchical composition of objects, components are responsible for implementing behavior.

The most frequently used types of components are [mesh components](meshes.md), which add meshes to the rendering pipeline. Similarly, [light components](../overview/features/lights.md) add lighting, and [cut plane components](../overview/features/cut-planes.md) cut open meshes.

All of these components use the transform (position, rotation, scale) of the entity they are attached to as their reference point.

## Working with components

You can add, remove, and manipulate components programmatically:

```cs
// create a point light component
AzureSession session = GetCurrentlyConnectedSession();
PointLightComponent lightComponent = session.Actions.CreateComponent(ObjectType.PointLightComponent, ownerEntity) as PointLightComponent;

lightComponent.Color = new Color4Ub(255, 150, 20, 255);
lightComponent.Intensity = 11;

// ...

// destroy the component
lightComponent.Destroy();
lightComponent = null;
```

A component is attached to an entity at creation time and cannot be moved to another entity afterwards. Components are deleted explicitly with `Component.Destroy()`, or automatically when the component's owner entity is destroyed.

Only one instance of each component type may be added to an entity at a time.
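Because of this one-instance rule, it can be useful to check for an existing component before creating a new one. A minimal sketch, assuming the `session` and `ownerEntity` variables from the example above and the generic `FindComponentOfType` query described in the [entities](entities.md) chapter:

```cs
// reuse an existing point light on the entity, or create one if none is attached yet
PointLightComponent light = ownerEntity.FindComponentOfType<PointLightComponent>();
if (light == null)
{
    light = session.Actions.CreateComponent(ObjectType.PointLightComponent, ownerEntity) as PointLightComponent;
}
light.Intensity = 11;
```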
## Unity specific

The Unity integration provides additional extension functions for interacting with components. See [Unity game objects and components](../how-tos/unity/objects-components.md).

## Next steps

* [Object bounds](object-bounds.md)
* [Meshes](meshes.md)
Lines changed: 82 additions & 0 deletions

---
title: Entities
description: Definition of entities in the scope of the Azure Remote Rendering API
author: florianborn71
ms.author: flborn
ms.date: 02/03/2020
ms.topic: conceptual
---

# Entities

An *Entity* represents a movable object in space and is the fundamental building block of remotely rendered content.

## Entity properties

Entities have a transform defined by a position, rotation, and scale. By themselves, entities do not have any observable functionality. Instead, behavior is added through components, which are attached to entities. For instance, attaching a [CutPlaneComponent](../overview/features/cut-planes.md) creates a cut plane at the position of the entity.

The most important aspect of the entity itself is the hierarchy and the resulting hierarchical transform. For example, when multiple entities are attached as children to a shared parent entity, all of these entities can be moved, rotated, and scaled in unison by changing the transform of the parent entity.

An entity is uniquely owned by its parent, meaning that when the parent is destroyed with `Entity.Destroy()`, so are its children and all connected [components](components.md). Thus, removing a model from the scene is accomplished by calling `Destroy` on the root node of the model, returned by `AzureSession.Actions.LoadModelAsync()` or its SAS variant `AzureSession.Actions.LoadModelFromSASAsync()`.
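For illustration, a sketch of removing a loaded model this way. The async-operation type, its parameter type, and the `Root` property on the load result are assumptions here, following the naming pattern used elsewhere in this article:

```cs
// load a model, then later remove it from the scene by destroying its root entity
LoadModelAsync loadOp = session.Actions.LoadModelFromSASAsync(new LoadModelFromSASParams(modelUri)); // assumed types
loadOp.Completed += (LoadModelAsync op) =>
{
    Entity modelRoot = op.Result.Root; // assumed property name
    // ...
    modelRoot.Destroy(); // removes the whole model, including children and components
};
```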
Entities are created when the server loads content or when the user wants to add an object to the scene. For example, if a user wants to add a cut plane to visualize the interior of a mesh, the user can create an entity where the plane should exist and then add the cut plane component to it.
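A sketch of that workflow, assuming a `CreateEntity` action exists alongside the `CreateComponent` action shown in the [components](components.md) chapter; both the `CreateEntity` name and the `Double3` constructor are assumptions:

```cs
// create an entity at the desired cut position, then attach the cut plane behavior to it
Entity cutEntity = session.Actions.CreateEntity(); // assumed action name
cutEntity.Position = new Double3(0, 1.5, 0); // assumed constructor
CutPlaneComponent cutPlane = session.Actions.CreateComponent(ObjectType.CutPlaneComponent, cutEntity) as CutPlaneComponent;
```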
## Query functions

There are two types of query functions on entities: synchronous and asynchronous calls. Synchronous queries can only be used for data that is present on the client and does not involve much computation. Examples are querying for components, relative object transforms, or parent/child relationships. Asynchronous queries are used for data that resides only on the server or involves extra computation that would be too expensive to run on the client. Examples are spatial bounds queries or metadata queries.

### Querying components

To find a component of a specific type, use `FindComponentOfType`:

```cs
CutPlaneComponent cutplane = (CutPlaneComponent)entity.FindComponentOfType(ObjectType.CutPlaneComponent);

// or alternatively:
CutPlaneComponent cutplane = entity.FindComponentOfType<CutPlaneComponent>();
```

### Querying transforms

Transform queries are synchronous calls on the object. Note that transforms queried through the API are local space transforms, relative to the object's parent. Exceptions are root objects, for which local space and world space are identical.

> [!NOTE]
> There is no dedicated API to query the world space transform of arbitrary objects.
```cs
// local space transform of the entity
Double3 translation = entity.Position;
Quaternion rotation = entity.Rotation;
```
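Since there is no dedicated world-space query, a world-space position can be derived by composing local transforms up the parent chain. A minimal sketch, assuming a `Parent` and a `Scale` property on `Entity` and helper functions for `Double3` arithmetic (all of these are assumptions, not confirmed API):

```cs
// walk up the hierarchy, transforming the position into each parent's space in turn
Double3 WorldPosition(Entity entity)
{
    Double3 position = entity.Position;
    for (Entity parent = entity.Parent; parent != null; parent = parent.Parent) // assumed Parent property
    {
        // apply the parent's scale, rotation, and translation, in that order
        position = Scale(parent.Scale, position);     // assumed helper
        position = Rotate(parent.Rotation, position); // assumed helper
        position = Add(parent.Position, position);    // assumed helper
    }
    return position;
}
```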
### Querying spatial bounds

Bounds queries are asynchronous calls that operate on a full object hierarchy, using one entity as the root. See the dedicated chapter about [object bounds](object-bounds.md).

### Querying metadata

Metadata is additional data stored on objects that is ignored by the server. Object metadata is essentially a set of (name, value) pairs, where _value_ can be of numeric, boolean, or string type. Metadata can be exported with the model.

Metadata queries are asynchronous calls on a specific entity. The query only returns the metadata of a single entity, not the merged information of a subgraph.
```cs
MetadataQueryAsync metaDataQuery = entity.QueryMetaDataAsync();
metaDataQuery.Completed += (MetadataQueryAsync query) =>
{
    if (query.IsRanToCompletion)
    {
        ObjectMetaData metaData = query.Result;
        ObjectMetaDataEntry entry = metaData.GetMetadataByName("MyInt64Value");
        System.Int64 intValue = entry.AsInt64;

        // ...
    }
};
```

The query succeeds even if the object does not hold any metadata.

## Next steps

* [Components](components.md)
* [Object bounds](object-bounds.md)
Lines changed: 155 additions & 0 deletions

---
title: Graphics binding
description: Setup of graphics bindings and use cases
author: florianborn71
manager: jlyons
services: azure-remote-rendering
titleSuffix: Azure Remote Rendering
ms.author: flborn
ms.date: 12/11/2019
ms.topic: conceptual
ms.service: azure-remote-rendering
---

# Graphics binding

To use Azure Remote Rendering in a custom application, it needs to be integrated into the application's rendering pipeline. This integration is the responsibility of the graphics binding.

Once set up, the graphics binding gives access to various functions that affect the rendered image. These functions fall into two categories: general functions that are always available, and specific functions that are only relevant for the selected `Microsoft.Azure.RemoteRendering.GraphicsApiType`.

## Graphics binding in Unity

In Unity, the entire binding is handled by the `RemoteUnityClientInit` struct passed into `RemoteManagerUnity.InitializeManager`. To set the graphics mode, the `GraphicsApiType` field has to be set to the chosen binding. The field is automatically populated depending on whether an XRDevice is present. The behavior can be manually overridden, with the following caveats:

* **HoloLens 2**: the [Windows Mixed Reality](#windows-mixed-reality) graphics binding is always used.
* **Flat UWP desktop app**: [Simulation](#simulation) is always used. To use this mode, make sure to follow the steps in [Tutorial: Setting up a Unity project from scratch](../tutorials/unity/project-setup.md).
* **Unity editor**: [Simulation](#simulation) is always used, unless a WMR VR headset is connected, in which case ARR is disabled so that the non-ARR parts of the application can be debugged. See also [holographic remoting](../how-tos/unity/holographic-remoting.md).
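A sketch of setting the field explicitly. Only the struct name, the `GraphicsApiType` field, and `InitializeManager` are named above; the `RemoteUnityClientInit` constructor argument is an assumption:

```cs
// force the simulation binding instead of relying on XRDevice auto-detection
RemoteUnityClientInit clientInit = new RemoteUnityClientInit(Camera.main); // assumed constructor
clientInit.GraphicsApiType = GraphicsApiType.SimD3D11;
RemoteManagerUnity.InitializeManager(clientInit);
```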
The only other part relevant to Unity is accessing the [basic binding](#access); all the other sections below can be skipped.
## Graphics binding setup in custom applications

Selecting a graphics binding takes two steps. First, the graphics binding has to be statically initialized when the program starts up:

``` cs
RemoteRenderingInitialization managerInit = new RemoteRenderingInitialization();
managerInit.graphicsApi = GraphicsApiType.WmrD3D11;
managerInit.connectionType = ConnectionType.General;
managerInit.right = //...
RemoteManagerStatic.StartupRemoteRendering(managerInit);
```
The call above is necessary to initialize Azure Remote Rendering into the holographic APIs. This function must be called before any holographic API is called and before any other Remote Rendering API is accessed. Similarly, the corresponding de-initialization function `RemoteManagerStatic.ShutdownRemoteRendering();` should be called after no holographic APIs are called anymore.

## <span id="access">Accessing the graphics binding</span>
Once a client is set up, the basic graphics binding can be accessed through the `AzureSession.GraphicsBinding` getter. As an example, the statistics of the last frame can be retrieved like this:

``` cs
AzureSession currentSession = ...;
if (currentSession.GraphicsBinding != null)
{
    FrameStatistics frameStatistics;
    if (currentSession.GraphicsBinding.GetLastFrameStatistics(out frameStatistics) == Result.Success)
    {
        ...
    }
}
```
## Graphics APIs

There are currently two graphics APIs that can be selected: `WmrD3D11` and `SimD3D11`. A third one, `Headless`, exists but is not yet supported on the client side.
### Windows Mixed Reality

`GraphicsApiType.WmrD3D11` is the default binding to run on HoloLens 2. It creates the `GraphicsBindingWmrD3d11` binding. In this mode, Azure Remote Rendering hooks directly into the holographic APIs.

To access the derived graphics binding, the base `GraphicsBinding` has to be cast.
There are two things that need to be done to use the WMR binding:

#### Inform Remote Rendering of the used coordinate system

``` cs
AzureSession currentSession = ...;
IntPtr ptr = ...; // native pointer to ISpatialCoordinateSystem
GraphicsBindingWmrD3d11 wmrBinding = (currentSession.GraphicsBinding as GraphicsBindingWmrD3d11);
if (wmrBinding.UpdateUserCoordinateSystem(ptr) == Result.Success)
{
    ...
}
```

Here, `ptr` must be a pointer to a native `ABI::Windows::Perception::Spatial::ISpatialCoordinateSystem` object that defines the world space coordinate system in which coordinates in the API are expressed.
#### Render remote image

At the start of each frame, the remote frame needs to be rendered into the back buffer. This is done by calling `BlitRemoteFrame`, which fills both color and depth information into the currently bound render target. It is therefore important to do this after binding the back buffer as a render target.

``` cs
AzureSession currentSession = ...;
GraphicsBindingWmrD3d11 wmrBinding = (currentSession.GraphicsBinding as GraphicsBindingWmrD3d11);
wmrBinding.BlitRemoteFrame();
```
### Simulation

`GraphicsApiType.SimD3D11` is the simulation binding; if selected, it creates the `GraphicsBindingSimD3d11` graphics binding. This interface is used to simulate head movement, for example in a desktop application, and renders a monoscopic image.
The setup is a bit more involved and works as follows:

#### Create proxy render target

Remote and local content needs to be rendered to an offscreen color / depth render target called 'proxy', using
the proxy camera data provided by the `GraphicsBindingSimD3d11.Update` function. The proxy must match the resolution of the back buffer. Once a session is ready, `GraphicsBindingSimD3d11.InitSimulation` needs to be called before connecting to it:

``` cs
AzureSession currentSession = ...;
IntPtr d3dDevice = ...; // native pointer to ID3D11Device
IntPtr color = ...; // native pointer to ID3D11Texture2D
IntPtr depth = ...; // native pointer to ID3D11Texture2D
float refreshRate = 60.0f; // monitor refresh rate, up to 60 Hz
bool flipBlitRemoteFrameTextureVertically = false;
bool flipReprojectTextureVertically = false;
GraphicsBindingSimD3d11 simBinding = (currentSession.GraphicsBinding as GraphicsBindingSimD3d11);
simBinding.InitSimulation(d3dDevice, depth, color, refreshRate, flipBlitRemoteFrameTextureVertically, flipReprojectTextureVertically);
```

The init function needs to be provided with pointers to the native D3D device as well as to the color and depth texture of the proxy render target. Once initialized, `AzureSession.ConnectToRuntime` and `DisconnectFromRuntime` can be called multiple times. However, when switching to a different session, `GraphicsBindingSimD3d11.DeinitSimulation` needs to be called on the old session before `GraphicsBindingSimD3d11.InitSimulation` can be called on the new one.
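The session switch described above can be sketched as follows, reusing the variables from the init example; `oldSession` and `newSession` are hypothetical placeholders for the two sessions involved:

```cs
// tear down the simulation on the session being abandoned ...
GraphicsBindingSimD3d11 oldBinding = (oldSession.GraphicsBinding as GraphicsBindingSimD3d11);
oldBinding.DeinitSimulation();

// ... before initializing it on the session being switched to
GraphicsBindingSimD3d11 newBinding = (newSession.GraphicsBinding as GraphicsBindingSimD3d11);
newBinding.InitSimulation(d3dDevice, depth, color, refreshRate, false, false);
```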
#### Render loop update

The render loop update consists of multiple steps:

1. Each frame, before any rendering takes place, `GraphicsBindingSimD3d11.Update` is called with the current camera transform, which is sent over to the server to be rendered. At the same time, the returned proxy transform should be applied to the proxy camera to render into the proxy render target.
If the returned proxy update's `SimulationUpdate.frameId` is zero, there is no remote data yet. In this case, instead of rendering into the proxy render target, any local content should be rendered to the back buffer directly using the current camera data, and the next two steps are skipped.
1. The application should now bind the proxy render target and call `GraphicsBindingSimD3d11.BlitRemoteFrameToProxy`. This fills the remote color and depth information into the proxy render target. Any local content can now be rendered onto the proxy using the proxy camera transform.
1. Next, the back buffer needs to be bound as a render target and `GraphicsBindingSimD3d11.ReprojectProxy` called, at which point the back buffer can be presented.
``` cs
AzureSession currentSession = ...;
GraphicsBindingSimD3d11 simBinding = (currentSession.GraphicsBinding as GraphicsBindingSimD3d11);
SimulationUpdate update = new SimulationUpdate();
// Fill out the update with the current camera data
...
SimulationUpdate proxyUpdate;
simBinding.Update(update, out proxyUpdate);
// Is the frame data valid?
if (proxyUpdate.frameId != 0)
{
    // Bind proxy render target
    simBinding.BlitRemoteFrameToProxy();
    // Use proxy camera data to render local content
    ...
    // Bind back buffer
    simBinding.ReprojectProxy();
}
else
{
    // Bind back buffer
    // Use current camera data to render local content
    ...
}
```

## Next steps

* [Tutorial: Setting up a Unity project from scratch](../tutorials/unity/project-setup.md)
