Commit 43ed1b1

Merge pull request #265063 from jowang-msft/jowang-msft/unityrawvideo1
Add ACS Unity SDK raw media quick start sample doc
2 parents 07ca8fb + bc19f0e commit 43ed1b1

File tree

2 files changed: +228 -1 lines changed


articles/communication-services/quickstarts/voice-video-calling/get-started-raw-media-access.md

Lines changed: 4 additions & 1 deletion

```diff
@@ -9,11 +9,14 @@ ms.date: 06/30/2022
 ms.topic: quickstart
 ms.service: azure-communication-services
 ms.subservice: calling
-zone_pivot_groups: acs-plat-web-ios-android-windows
+zone_pivot_groups: acs-plat-web-ios-android-windows-unity
 ms.custom: mode-other, devx-track-js
 ---

 # Quickstart: Add raw media access to your app
+::: zone pivot="platform-unity"
+[!INCLUDE [Raw media with Unity](./includes/raw-media/raw-media-access-unity.md)]
+::: zone-end

 ::: zone pivot="platform-windows"
 [!INCLUDE [Raw media with Windows](./includes/raw-media/raw-media-access-windows.md)]
```
articles/communication-services/quickstarts/voice-video-calling/includes/raw-media/raw-media-access-unity.md

Lines changed: 224 additions & 0 deletions
---
title: Quickstart - Add raw media access to your app (Unity)
titleSuffix: An Azure Communication Services quickstart
description: In this quickstart, you learn how to add raw media access calling capabilities to your Unity app by using Azure Communication Services.
author: jowang-msft
ms.author: jowang
ms.date: 02/02/2024
ms.topic: include
ms.service: azure-communication-services
ms.subservice: calling
---

In this quickstart, you learn how to implement raw media access by using the Azure Communication Services Calling SDK for Unity.

The Azure Communication Services Calling SDK offers APIs that let apps generate their own video frames to send in a call, and render raw video frames received from remote participants.

This quickstart builds on [Quickstart: Add 1:1 video calling to your app](../../get-started-with-video-calling.md?pivots=platform-unity) for Unity.

## Raw video access

Because the app generates the video frames, the app must inform the Azure Communication Services Calling SDK about the video formats that the app can generate. This information allows the Azure Communication Services Calling SDK to pick the best video format configuration for the network conditions at that time.

## Virtual video

### Supported video resolutions

| Aspect ratio | Resolution | Maximum FPS |
| :--: | :-: | :-: |
| 16x9 | 1080p | 30 |
| 16x9 | 720p | 30 |
| 16x9 | 540p | 30 |
| 16x9 | 480p | 30 |
| 16x9 | 360p | 30 |
| 16x9 | 270p | 15 |
| 16x9 | 240p | 15 |
| 16x9 | 180p | 15 |
| 4x3 | VGA (640x480) | 30 |
| 4x3 | 424x320 | 15 |
| 4x3 | QVGA (320x240) | 15 |
| 4x3 | 212x160 | 15 |
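For the RGBA pixel format used later in this article, the stride and per-frame buffer size of any resolution in the table follow directly from the width and height. Here's a minimal sketch of that arithmetic; the `RgbaFrameMath` helper is illustrative, not part of the SDK:

```csharp
using System;

// Illustrative helper (not an ACS SDK type) for tightly packed RGBA frames.
public static class RgbaFrameMath
{
    // RGBA is a 32-bit format, so every pixel occupies 4 bytes.
    public const int BytesPerPixel = 4;

    // Stride: number of bytes in one row of pixels.
    public static int Stride(int width) => width * BytesPerPixel;

    // Capacity needed to hold one full frame.
    public static int BufferSize(int width, int height) => Stride(width) * height;
}
```

For example, a 360p (640x360) frame needs a stride of 2,560 bytes and a 921,600-byte buffer.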

1. Follow the steps in [Quickstart: Add 1:1 video calling to your app](../../get-started-with-video-calling.md?pivots=platform-unity) to create a Unity game. The goal is to obtain a `CallAgent` object that's ready to begin the call.

   Find the finalized code for this quickstart on [GitHub](https://github.com/Azure-Samples/communication-services-dotnet-quickstarts/tree/main/Unity/RawVideo).

2. Create an array of `VideoStreamFormat` by using the `VideoStreamPixelFormat` values that the SDK supports.

   When multiple formats are available, the order of the formats in the list doesn't influence or prioritize which one is used. The criteria for format selection are based on external factors like network bandwidth.

   ```csharp
   var videoStreamFormat = new VideoStreamFormat
   {
       Resolution = VideoStreamResolution.P360, // For VirtualOutgoingVideoStream, set the width and height by using the VideoStreamResolution enum
       PixelFormat = VideoStreamPixelFormat.Rgba,
       FramesPerSecond = 15,
       Stride1 = 640 * 4 // It's times 4 because RGBA is a 32-bit format
   };
   VideoStreamFormat[] videoStreamFormats = { videoStreamFormat };
   ```
3. Create `RawOutgoingVideoStreamOptions`, and set `Formats` with the previously created object.

   ```csharp
   var rawOutgoingVideoStreamOptions = new RawOutgoingVideoStreamOptions
   {
       Formats = videoStreamFormats
   };
   ```

4. Create an instance of `VirtualOutgoingVideoStream` by using the `RawOutgoingVideoStreamOptions` instance that you created previously.

   ```csharp
   var rawOutgoingVideoStream = new VirtualOutgoingVideoStream(rawOutgoingVideoStreamOptions);
   ```
5. Subscribe to the `RawOutgoingVideoStream.FormatChanged` delegate. This event notifies you whenever the `VideoStreamFormat` changes to one of the video formats provided in the list.

   ```csharp
   rawOutgoingVideoStream.FormatChanged += (object sender, VideoStreamFormatChangedEventArgs args) =>
   {
       VideoStreamFormat videoStreamFormat = args.Format;
   };
   ```
6. Subscribe to the `RawOutgoingVideoStream.StateChanged` delegate. This event notifies you whenever the `State` changes.

   ```csharp
   rawOutgoingVideoStream.StateChanged += (object sender, VideoStreamStateChangedEventArgs args) =>
   {
       CallVideoStream callVideoStream = args.Stream;

       switch (callVideoStream.Direction)
       {
           case StreamDirection.Outgoing:
               OnRawOutgoingVideoStreamStateChanged(callVideoStream as OutgoingVideoStream);
               break;
           case StreamDirection.Incoming:
               OnRawIncomingVideoStreamStateChanged(callVideoStream as IncomingVideoStream);
               break;
       }
   };
   ```
7. Handle raw outgoing video stream state transitions, such as Start and Stop, to begin generating custom video frames or to suspend the frame-generating algorithm.

   ```csharp
   private async void OnRawOutgoingVideoStreamStateChanged(OutgoingVideoStream outgoingVideoStream)
   {
       switch (outgoingVideoStream.State)
       {
           case VideoStreamState.Started:
               switch (outgoingVideoStream.Kind)
               {
                   case VideoStreamKind.VirtualOutgoing:
                       outgoingVideoPlayer.StartGenerateFrames(outgoingVideoStream); // This is where a background worker thread can be started to feed the outgoing video frames.
                       break;
               }
               break;

           case VideoStreamState.Stopped:
               switch (outgoingVideoStream.Kind)
               {
                   case VideoStreamKind.VirtualOutgoing:
                       break;
               }
               break;
       }
   }
   ```

   Here's a sample outgoing video frame generator:

   ```csharp
   private unsafe RawVideoFrame GenerateRawVideoFrame(RawOutgoingVideoStream rawOutgoingVideoStream)
   {
       var format = rawOutgoingVideoStream.Format;
       int w = format.Width;
       int h = format.Height;
       int rgbaCapacity = w * h * 4;

       var rgbaBuffer = new NativeBuffer(rgbaCapacity);
       rgbaBuffer.GetData(out IntPtr rgbaArrayBuffer, out rgbaCapacity);

       byte r = (byte)random.Next(1, 255);
       byte g = (byte)random.Next(1, 255);
       byte b = (byte)random.Next(1, 255);

       for (int y = 0; y < h; y++)
       {
           for (int x = 0; x < w * 4; x += 4)
           {
               ((byte*)rgbaArrayBuffer)[(w * 4 * y) + x + 0] = (byte)(y % r);
               ((byte*)rgbaArrayBuffer)[(w * 4 * y) + x + 1] = (byte)(y % g);
               ((byte*)rgbaArrayBuffer)[(w * 4 * y) + x + 2] = (byte)(y % b);
               ((byte*)rgbaArrayBuffer)[(w * 4 * y) + x + 3] = 255;
           }
       }

       // Call the ACS Unity SDK API to deliver the frame
       rawOutgoingVideoStream.SendRawVideoFrameAsync(new RawVideoFrameBuffer() {
           Buffers = new NativeBuffer[] { rgbaBuffer },
           StreamFormat = rawOutgoingVideoStream.Format,
           TimestampInTicks = rawOutgoingVideoStream.TimestampInTicks
       }).Wait();

       return new RawVideoFrameBuffer()
       {
           Buffers = new NativeBuffer[] { rgbaBuffer },
           StreamFormat = rawOutgoingVideoStream.Format
       };
   }
   ```

   > [!NOTE]
   > The `unsafe` modifier is used on this method because `NativeBuffer` requires access to native memory resources. Therefore, the **Allow 'unsafe' code** option also needs to be enabled in the Unity editor.

8. Similarly, handle incoming video frames in response to the video stream `StateChanged` event.

   ```csharp
   private void OnRawIncomingVideoStreamStateChanged(IncomingVideoStream incomingVideoStream)
   {
       switch (incomingVideoStream.State)
       {
           case VideoStreamState.Available:
           {
               var rawIncomingVideoStream = incomingVideoStream as RawIncomingVideoStream;
               rawIncomingVideoStream.RawVideoFrameReceived += OnRawVideoFrameReceived;
               rawIncomingVideoStream.Start();
               break;
           }
           case VideoStreamState.Stopped:
               break;
           case VideoStreamState.NotAvailable:
               break;
       }
   }

   private void OnRawVideoFrameReceived(object sender, RawVideoFrameReceivedEventArgs e)
   {
       incomingVideoPlayer.RenderRawVideoFrame(e.Frame);
   }

   public void RenderRawVideoFrame(RawVideoFrame rawVideoFrame)
   {
       var videoFrameBuffer = rawVideoFrame as RawVideoFrameBuffer;
       pendingIncomingFrames.Enqueue(new PendingFrame() {
           frame = rawVideoFrame,
           kind = RawVideoFrameKind.Buffer });
   }
   ```

9. We highly recommend managing both incoming and outgoing video frames through a buffering mechanism, to avoid overloading the `MonoBehaviour.Update()` callback method. Keep that method lightweight, and avoid CPU-heavy or network-heavy duties, to ensure a smoother video experience. This optional optimization is left to developers to decide what works best in their scenarios.

   Here's a sample of how the incoming frames can be rendered to a Unity `RenderTexture` by calling `Graphics.Blit` out of an internal queue:

   ```csharp
   private void Update()
   {
       if (pendingIncomingFrames.TryDequeue(out PendingFrame pendingFrame))
       {
           switch (pendingFrame.kind)
           {
               case RawVideoFrameKind.Buffer:
                   var videoFrameBuffer = pendingFrame.frame as RawVideoFrameBuffer;
                   VideoStreamFormat videoFormat = videoFrameBuffer.StreamFormat;
                   int width = videoFormat.Width;
                   int height = videoFormat.Height;
                   var texture = new Texture2D(width, height, TextureFormat.RGBA32, mipChain: false);

                   var buffers = videoFrameBuffer.Buffers;
                   NativeBuffer buffer = buffers.Count > 0 ? buffers[0] : null;
                   buffer.GetData(out IntPtr bytes, out int signedSize);

                   texture.LoadRawTextureData(bytes, signedSize);
                   texture.Apply();

                   Graphics.Blit(source: texture, dest: rawIncomingVideoRenderTexture);
                   break;

               case RawVideoFrameKind.Texture:
                   break;
           }
           pendingFrame.frame.Dispose();
       }
   }
   ```
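The nested loops in the outgoing frame generator shown earlier walk the RGBA buffer row by row. That indexing can be captured as a single byte-offset formula; the helper below is illustrative, not an SDK API:

```csharp
using System;

// Illustrative helper: byte offset of one channel of pixel (x, y) in a
// tightly packed RGBA buffer. Channels are 0 = R, 1 = G, 2 = B, 3 = A.
public static class RgbaIndexing
{
    public static int Offset(int width, int x, int y, int channel) =>
        (width * 4 * y) + (x * 4) + channel;
}
```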
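The buffering mechanism recommended in the last step can be sketched, independent of the SDK, as a bounded queue that drops the oldest pending frame when the renderer falls behind, so `Update()` never works through a growing backlog. The `PendingFrame` type here is a stand-in for the one used in this article:

```csharp
using System;
using System.Collections.Concurrent;

// Stand-in for the PendingFrame type used in this article.
public class PendingFrame
{
    public int Id; // placeholder payload for illustration
}

// Bounded frame queue sketch: producers enqueue from SDK callbacks,
// Update() dequeues on the main thread.
public class BoundedFrameQueue
{
    private readonly ConcurrentQueue<PendingFrame> queue = new ConcurrentQueue<PendingFrame>();
    private readonly int capacity;

    public BoundedFrameQueue(int capacity) => this.capacity = capacity;

    public void Enqueue(PendingFrame frame)
    {
        queue.Enqueue(frame);
        // Drop stale frames instead of letting the backlog grow: a dropped
        // frame is preferable to rendering with ever-increasing latency.
        while (queue.Count > capacity && queue.TryDequeue(out PendingFrame stale))
        {
            // In the real app, dispose the stale frame here.
        }
    }

    public bool TryDequeue(out PendingFrame frame) => queue.TryDequeue(out frame);

    public int Count => queue.Count;
}
```

With a capacity of two, enqueuing a third frame evicts the oldest one, so the renderer always sees the most recent frames.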
