---
title: Quickstart - Add RAW media access to your app (Windows)
titleSuffix: An Azure Communication Services quickstart
description: In this quickstart, you'll learn how to add raw media access calling capabilities to your app using Azure Communication Services.
author: yassirbisteni

ms.author: yassirb
ms.date: 06/09/2022
ms.topic: quickstart
ms.service: azure-communication-services
ms.subservice: calling
ms.custom: mode-other
---

## Raw video

[!INCLUDE [Public Preview](../../../../includes/public-preview-include-document.md)]

In this quickstart, you'll learn how to implement raw media access by using the Azure Communication Services Calling SDK for Windows.

The Azure Communication Services Calling SDK offers APIs that allow apps to generate their own video frames to send to remote participants.

This quickstart builds on [QuickStart: Add 1:1 video calling to your app](../../get-started-with-video-calling.md?pivots=platform-windows) for Windows.

## Virtual video stream overview

Because the app generates the video frames, it must inform the Azure Communication Services Calling SDK about the video formats it's capable of generating. This information allows the Calling SDK to pick the best video format configuration for the network conditions at any given time.

The app must register a delegate to get notified about when it should start or stop producing video frames. The delegate event informs the app which video format best fits the current network conditions.

### Supported video resolutions

| Aspect ratio | Resolution | Maximum FPS |
| :--: | :-: | :-: |
| 16x9 | 1080p | 30 |
| 16x9 | 720p | 30 |
| 16x9 | 540p | 30 |
| 16x9 | 480p | 30 |
| 16x9 | 360p | 30 |
| 16x9 | 270p | 15 |
| 16x9 | 240p | 15 |
| 16x9 | 180p | 15 |
| 4x3 | VGA (640x480) | 30 |
| 4x3 | 424x320 | 15 |
| 4x3 | QVGA (320x240) | 15 |
| 4x3 | 212x160 | 15 |

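The stride values and buffer sizes used later in this quickstart follow directly from this table: an RGBA frame carries 4 bytes per pixel, so the stride is `width * 4` and the frame buffer is `width * height * 4` bytes. The following sketch shows that arithmetic; the `RgbaStride` and `RgbaBufferSize` helpers are illustrative, not SDK APIs.

```csharp
// Illustrative helpers: byte layout for 32-bit RGBA frames.
static int RgbaStride(int width) => width * 4; // bytes per row
static uint RgbaBufferSize(int width, int height) => (uint)(width * height * 4);

// For example, 720p (1280x720) needs a stride of 5,120 bytes
// and a 3,686,400-byte buffer per frame.
```
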
The following is an overview of the steps required to create a virtual video stream.

1. Create an array of `VideoFormat` with the video formats the app supports. It's fine to support only one video format, but at least one of the provided video formats must be of the `VideoFrameKind.VideoSoftware` type. When multiple formats are provided, their order in the list doesn't influence or prioritize which one is used. The selected format is based on external factors like network bandwidth.

    ```csharp
    var videoFormat = new VideoFormat
    {
        Width = 1280,
        Height = 720,
        PixelFormat = PixelFormat.Rgba,
        VideoFrameKind = VideoFrameKind.VideoSoftware,
        FramesPerSecond = 30,
        Stride1 = 1280 * 4 // It is times 4 because RGBA is a 32-bit format.
    };

    VideoFormat[] videoFormats = { videoFormat };
    ```

2. Create `RawOutgoingVideoStreamOptions` and set `VideoFormats` with the array of formats created previously.

    ```csharp
    RawOutgoingVideoStreamOptions rawOutgoingVideoStreamOptions = new RawOutgoingVideoStreamOptions();
    rawOutgoingVideoStreamOptions.SetVideoFormats(videoFormats);
    ```

3. Subscribe to the `RawOutgoingVideoStreamOptions.OnOutgoingVideoStreamStateChanged` event. This event reports the state of the current stream. Don't send frames unless the state equals `OutgoingVideoStreamState.STARTED`.

    ```csharp
    private OutgoingVideoStreamState outgoingVideoStreamState;

    rawOutgoingVideoStreamOptions.OnOutgoingVideoStreamStateChanged += (object sender, OutgoingVideoStreamStateChangedEventArgs args) =>
    {
        outgoingVideoStreamState = args.OutgoingVideoStreamState;
    };
    ```

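    With the state cached in `outgoingVideoStreamState`, the app can gate its frame loop on it. A minimal illustrative check (the `CanSendVideoFrames` helper is not an SDK API) could look like this:

    ```csharp
    // Only produce frames while the outgoing stream is started.
    private bool CanSendVideoFrames()
    {
        return outgoingVideoStreamState == OutgoingVideoStreamState.STARTED;
    }
    ```
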
4. Make sure the `RawOutgoingVideoStreamOptions.OnVideoFrameSenderChanged` event is handled. This event notifies the app whenever it should start or stop producing video frames. In this quickstart, `videoFrameSender` is used as the trigger to let the app know when it's time to start generating frames. Feel free to use any mechanism in your app as a trigger.

    ```csharp
    private VideoFrameSender videoFrameSender;

    rawOutgoingVideoStreamOptions.OnVideoFrameSenderChanged += (object sender, VideoFrameSenderChangedEventArgs args) =>
    {
        videoFrameSender = args.VideoFrameSender;
    };
    ```

5. Create an instance of `VirtualRawOutgoingVideoStream` by using the `RawOutgoingVideoStreamOptions` created previously.

    ```csharp
    private VirtualRawOutgoingVideoStream virtualRawOutgoingVideoStream;

    virtualRawOutgoingVideoStream = new VirtualRawOutgoingVideoStream(rawOutgoingVideoStreamOptions);
    ```

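    The stream then has to be attached to the call so that the SDK can start requesting frames. The exact call-side API can differ by SDK version; the following sketch assumes the call object exposes a `StartVideoAsync` overload that accepts an outgoing video stream, so treat it as illustrative only and check the SDK version you're using for the exact method.

    ```csharp
    // Assumption: the call object has a StartVideoAsync overload for raw streams.
    await call.StartVideoAsync(virtualRawOutgoingVideoStream);
    ```
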
6. Once `outgoingVideoStreamState` is equal to `OutgoingVideoStreamState.STARTED`, create an instance of the `VideoFrameGenerator` class shown below. The class starts a non-UI thread that sends frames, and its `OnVideoFrameSenderChanged` handler captures the updated `VideoFrameSender` each time the event from the previous step fires. Cast the `VideoFrameSender` to the type that corresponds to the `VideoFrameKind` property of `VideoFormat`, in this case `SoftwareBasedVideoFrameSender`, and call `SendFrameAsync` according to the number of planes defined by the `VideoFormat`. Before sending, create the `MemoryBuffer` backing the video frame if needed, and then update its content.

    ```csharp
    using System;
    using System.Runtime.InteropServices;
    using System.Threading;
    using System.Threading.Tasks;
    using Windows.Foundation;
    // Plus the namespace of the Azure Communication Services Calling SDK.

    [ComImport]
    [Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
    [InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
    unsafe interface IMemoryBufferByteAccess
    {
        void GetBuffer(out byte* buffer, out uint capacity);
    }

    public class VideoFrameGenerator
    {
        private VideoFrameSender videoFrameSender;
        private Thread frameIteratorThread;
        private Random random;
        private volatile bool stopFrameIterator = false;

        public VideoFrameGenerator()
        {
            random = new Random();
        }

        public void VideoFrameIterator()
        {
            while (!stopFrameIterator && videoFrameSender != null)
            {
                GenerateVideoFrame().Wait();
            }
        }

        private async Task GenerateVideoFrame()
        {
            try
            {
                var softwareBasedVideoFrameSender = videoFrameSender as SoftwareBasedVideoFrameSender;
                VideoFormat videoFormat = softwareBasedVideoFrameSender.VideoFormat;

                // RGBA is a 32-bit format, so each pixel takes 4 bytes.
                uint bufferSize = (uint)(videoFormat.Width * videoFormat.Height) * 4;

                var memoryBuffer = new MemoryBuffer(bufferSize);
                IMemoryBufferReference memoryBufferReference = memoryBuffer.CreateReference();
                var memoryBufferByteAccess = memoryBufferReference as IMemoryBufferByteAccess;
                int w = videoFormat.Width;
                int h = videoFormat.Height;

                unsafe
                {
                    memoryBufferByteAccess.GetBuffer(out byte* destBytes, out uint destCapacity);

                    // Pick random color components to generate a varying test pattern.
                    byte r = (byte)random.Next(1, 255);
                    byte g = (byte)random.Next(1, 255);
                    byte b = (byte)random.Next(1, 255);

                    for (int y = 0; y < h; ++y)
                    {
                        // Each row is w * 4 bytes wide; advance one RGBA pixel at a time.
                        for (int x = 0; x < w * 4; x += 4)
                        {
                            destBytes[(w * 4 * y) + x] = (byte)(y % b);
                            destBytes[(w * 4 * y) + x + 1] = (byte)(y % g);
                            destBytes[(w * 4 * y) + x + 2] = (byte)(y % r);
                            destBytes[(w * 4 * y) + x + 3] = 0;
                        }
                    }
                }

                await softwareBasedVideoFrameSender.SendFrameAsync(memoryBuffer, videoFrameSender.TimestampInTicks);

                // Pace frame generation to the negotiated frame rate.
                int delayBetweenFrames = (int)(1000.0 / videoFormat.FramesPerSecond);
                await Task.Delay(delayBetweenFrames);
            }
            catch (Exception) { }
        }

        private void Start()
        {
            frameIteratorThread = new Thread(VideoFrameIterator);
            frameIteratorThread.Start();
        }

        public void Stop()
        {
            try
            {
                if (frameIteratorThread != null)
                {
                    stopFrameIterator = true;
                    frameIteratorThread.Join();
                    frameIteratorThread = null;
                    stopFrameIterator = false;
                }
            }
            catch (Exception) { }
        }

        public void OnVideoFrameSenderChanged(object sender, VideoFrameSenderChangedEventArgs args)
        {
            // Stop the current loop, swap in the new sender, and restart.
            Stop();
            this.videoFrameSender = args.VideoFrameSender;
            Start();
        }
    }
    ```
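
    To tie the pieces together, subscribe the generator's handler to the options before frames are requested. All the names here come from the previous steps; the wiring itself is a minimal sketch.

    ```csharp
    var videoFrameGenerator = new VideoFrameGenerator();

    // Route sender updates into the generator; its handler stops any running
    // frame loop, swaps in the new VideoFrameSender, and starts a new loop.
    rawOutgoingVideoStreamOptions.OnVideoFrameSenderChanged += videoFrameGenerator.OnVideoFrameSenderChanged;

    // Call videoFrameGenerator.Stop() when the call ends to join the frame thread.
    ```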

## Screen share video stream overview

Repeat steps 1 through 4 from the virtual video stream overview above.

Since the Windows system generates the frames, you must capture the frames yourself, for example with the Windows Graphics Capture APIs, and send them by using the Azure Communication Services Calling API.

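The following sketch shows one possible capture setup with the Windows Graphics Capture API. It only sets up the capture session; obtaining the `IDirect3DDevice` and converting each `Direct3D11CaptureFrame` into a `MemoryBuffer` require Direct3D interop that's beyond the scope of this quickstart, so treat the snippet as an outline rather than a complete implementation.

```csharp
using Windows.Graphics.Capture;
using Windows.Graphics.DirectX;

// Let the user pick a window or display to share.
var picker = new GraphicsCapturePicker();
GraphicsCaptureItem captureItem = await picker.PickSingleItemAsync();

// An IDirect3DDevice is required; obtaining one involves Direct3D interop (not shown).
Direct3D11CaptureFramePool framePool = Direct3D11CaptureFramePool.Create(
    direct3DDevice, DirectXPixelFormat.B8G8R8A8UIntNormalized, 2, captureItem.Size);

framePool.FrameArrived += (sender, args) =>
{
    using (Direct3D11CaptureFrame frame = sender.TryGetNextFrame())
    {
        // Copy the frame's pixels into a MemoryBuffer here, then send it.
    }
};

GraphicsCaptureSession session = framePool.CreateCaptureSession(captureItem);
session.StartCapture();
```
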
### Supported video resolutions

| Aspect ratio | Resolution | Maximum FPS |
| :--: | :-: | :-: |
| Any | Any | 30 |

The following is an overview of the steps required to create a screen share video stream.

1. Create an instance of `ScreenShareRawOutgoingVideoStream` by using the `RawOutgoingVideoStreamOptions` created previously.

    ```csharp
    private ScreenShareRawOutgoingVideoStream screenShareRawOutgoingVideoStream;

    screenShareRawOutgoingVideoStream = new ScreenShareRawOutgoingVideoStream(rawOutgoingVideoStreamOptions);
    ```

2. Capture the frames from the screen by using the Windows APIs.

    ```csharp
    MemoryBuffer memoryBuffer = // Fill it with the screen content captured through the Windows APIs.
    ```

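    If the captured pixels are available as a managed byte array, one way to move them into the `MemoryBuffer` is through the same `IMemoryBufferByteAccess` interface declared earlier. This is a minimal sketch under that assumption; `capturedBytes` is a hypothetical array holding one full frame of raw pixel data.

    ```csharp
    // Assumes capturedBytes holds one full frame of raw pixel data.
    var memoryBuffer = new MemoryBuffer((uint)capturedBytes.Length);
    IMemoryBufferReference reference = memoryBuffer.CreateReference();
    var byteAccess = reference as IMemoryBufferByteAccess;

    unsafe
    {
        byteAccess.GetBuffer(out byte* destBytes, out uint destCapacity);
        Marshal.Copy(capturedBytes, 0, (IntPtr)destBytes, capturedBytes.Length);
    }
    ```
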
3. Send the video frames as follows.

    ```csharp
    private async Task GenerateVideoFrame(MemoryBuffer memoryBuffer)
    {
        try
        {
            var softwareBasedVideoFrameSender = videoFrameSender as SoftwareBasedVideoFrameSender;
            VideoFormat videoFormat = softwareBasedVideoFrameSender.VideoFormat;

            await softwareBasedVideoFrameSender.SendFrameAsync(memoryBuffer, videoFrameSender.TimestampInTicks);
            int delayBetweenFrames = (int)(1000.0 / softwareBasedVideoFrameSender.VideoFormat.FramesPerSecond);
            await Task.Delay(delayBetweenFrames);
        }
        catch (Exception) { }
    }
    ```
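
    Call this method for every frame the capture source produces; the `Task.Delay` pacing keeps the send rate at or below the format's `FramesPerSecond`.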