---
title: Quickstart - Add raw media access to your app (Android)
titleSuffix: An Azure Communication Services quickstart
description: In this quickstart, you'll learn how to add raw media access calling capabilities to your app using Azure Communication Services.
author: LaithRodan

ms.author: larodan
ms.date: 11/18/2021
ms.topic: quickstart
ms.service: azure-communication-services
ms.subservice: calling


---

# Raw media access

[!INCLUDE [Public Preview](../../includes/public-preview-include-document.md)]

In this quickstart, you'll learn how to implement raw media access using the Azure Communication Services Calling SDK for Android.
| 21 | + |
## Outbound virtual video device

The ACS Calling SDK offers APIs that allow apps to generate their own video frames to send to remote participants.

This quickstart builds upon [QuickStart: Add 1:1 video calling to your app](./get-started-with-video-calling.md?pivots=platform-android) for Android.


## Overview

An outbound virtual video device, once created through `DeviceManager`, behaves just like any other webcam connected to your computer or mobile phone.

Since the app generates the video frames, it must inform the ACS Calling SDK about the video formats it's capable of producing. This information allows the ACS Calling SDK to pick the best video format configuration for the network conditions at any given time.

The app must register a delegate to be notified when it should start or stop producing video frames. The delegate event informs the app which video format is most appropriate for the current network conditions.

The following is an overview of the steps required to create an outbound virtual video device.
| 38 | + |
1. Create a `VirtualDeviceIdentification` with basic identification information for the new outbound virtual video device.

    ```java
    VirtualDeviceIdentification deviceId = new VirtualDeviceIdentification();
    deviceId.setId("QuickStartVirtualVideoDevice");
    deviceId.setName("My First Virtual Video Device");
    ```
| 46 | + |
2. Create an array of `VideoFormat` with the video formats supported by the app. It's fine to support only one video format, but at least one of the provided video formats must be of the `MediaFrameKind::VideoSoftware` type. When multiple formats are provided, their order in the list doesn't influence or prioritize which one is used. The selected format is based on external factors like network bandwidth.

    ```java
    ArrayList<VideoFormat> videoFormats = new ArrayList<VideoFormat>();

    VideoFormat format = new VideoFormat();
    format.setWidth(1280);
    format.setHeight(720);
    format.setPixelFormat(PixelFormat.RGBA);
    format.setMediaFrameKind(MediaFrameKind.VIDEO_SOFTWARE);
    format.setFramesPerSecond(30);
    format.setStride1(1280 * 4); // RGBA is a 32-bit format, so the stride is 4 bytes per pixel.

    videoFormats.add(format);
    ```
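    The stride arithmetic above generalizes to any frame width. As a minimal sketch, a plain helper (hypothetical, not part of the ACS SDK) makes the relationship between width, stride, and total buffer size explicit for tightly packed RGBA frames:

    ```java
    // Hypothetical helper: byte math for tightly packed RGBA frames,
    // where each pixel occupies 4 bytes (R, G, B, A).
    class RgbaFrameMath {
        // Bytes per row of the frame.
        static int strideBytes(int widthPixels) {
            return widthPixels * 4;
        }

        // Total bytes needed to back one frame (what stride1 * height yields).
        static int frameSizeBytes(int widthPixels, int heightPixels) {
            return strideBytes(widthPixels) * heightPixels;
        }
    }
    ```

    For the 1280x720 format above, this gives a stride of 5120 bytes and a frame buffer of 3,686,400 bytes, matching the `setStride1(1280 * 4)` call.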
| 62 | + |
3. Create `OutboundVirtualVideoDeviceOptions` and set `DeviceIdentification` and `VideoFormats` with the previously created objects.

    ```java
    OutboundVirtualVideoDeviceOptions m_options = new OutboundVirtualVideoDeviceOptions();

    // ...

    m_options.setDeviceIdentification(deviceId);
    m_options.setVideoFormats(videoFormats);
    ```
| 73 | + |
4. Make sure the `OutboundVirtualVideoDeviceOptions::OnFlowChanged` delegate is defined. This delegate informs its listener about events requiring the app to start or stop producing video frames. In this quickstart, `m_mediaFrameSender` is used as a trigger to let the app know when it's time to start generating frames. Feel free to use any mechanism in your app as a trigger.

    ```java
    private MediaFrameSender m_mediaFrameSender;

    // ...

    m_options.addOnFlowChangedListener(virtualDeviceFlowControlArgs -> {
        if (virtualDeviceFlowControlArgs.getMediaFrameSender().getRunningState() == VirtualDeviceRunningState.STARTED) {
            // Tell the app's frame generator to start producing frames.
            m_mediaFrameSender = virtualDeviceFlowControlArgs.getMediaFrameSender();
        } else {
            // Tell the app's frame generator to stop producing frames.
            m_mediaFrameSender = null;
        }
    });
    ```
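    Because the flow-changed listener runs on an SDK thread while the frame loop typically runs on an app thread, the shared sender reference should be published safely. One possible mechanism (an assumption, not the only option) is an `AtomicReference`, sketched here with a placeholder `FrameSink` type parameter standing in for the SDK's `MediaFrameSender`:

    ```java
    import java.util.concurrent.atomic.AtomicReference;

    // Hypothetical thread-safe trigger between the flow-changed listener
    // and the frame-generation loop. `FrameSink` stands in for the SDK's
    // MediaFrameSender type.
    class FrameTrigger<FrameSink> {
        private final AtomicReference<FrameSink> sender = new AtomicReference<>();

        // Called from the flow-changed listener: STARTED publishes the
        // sender, any other state clears it.
        void onFlowChanged(boolean started, FrameSink newSender) {
            sender.set(started ? newSender : null);
        }

        // Called from the frame loop; null means "stop producing frames".
        FrameSink current() {
            return sender.get();
        }
    }
    ```

    The frame loop then polls `current()` instead of reading a plain field, avoiding visibility issues between the two threads.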
| 91 | + |
5. Use `DeviceManager::CreateOutboundVirtualVideoDevice` to create an outbound virtual video device. The returned `OutboundVirtualVideoDevice` should be kept alive as long as the app needs to keep acting as a virtual video device. It's fine to register multiple outbound virtual video devices per app.

    ```java
    private OutboundVirtualVideoDevice m_outboundVirtualVideoDevice;

    // ...

    m_outboundVirtualVideoDevice = m_deviceManager.createOutboundVirtualVideoDevice(m_options).get();
    ```
| 101 | + |
6. Tell the device manager to use the recently created virtual camera on calls.

    ```java
    private LocalVideoStream m_localVideoStream;

    // ...

    for (VideoDeviceInfo videoDeviceInfo : m_deviceManager.getCameras())
    {
        String deviceId = videoDeviceInfo.getId();
        if (deviceId.equalsIgnoreCase("QuickStartVirtualVideoDevice")) // Same id used in step 1.
        {
            m_localVideoStream = new LocalVideoStream(videoDeviceInfo, getApplicationContext());
        }
    }
    ```
| 118 | + |
7. In a non-UI thread or loop in the app, cast the `MediaFrameSender` to the appropriate type defined by the `MediaFrameKind` property of `VideoFormat`. For example, cast it to `SoftwareBasedVideoFrame` and then call the `send` method according to the number of planes defined by the `VideoFormat`.
After that, create the ByteBuffer backing the video frame if needed. Then, update the content of the video frame. Finally, send the video frame to other participants with the `sendFrame` API.

    ```java
    java.nio.ByteBuffer plane1 = null;
    Random rand = new Random();

    // ...

    while (m_outboundVirtualVideoDevice != null) {
        while (m_mediaFrameSender != null) {
            if (m_mediaFrameSender.getMediaFrameKind() == MediaFrameKind.VIDEO_SOFTWARE) {
                SoftwareBasedVideoFrame sender = (SoftwareBasedVideoFrame) m_mediaFrameSender;
                VideoFormat videoFormat = sender.getVideoFormat();

                // Gets the timestamp for when the video frame has been created.
                // This allows better synchronization with audio.
                long timeStamp = sender.getTimestamp();

                // Adjusts frame dimensions to the video format that network conditions can manage.
                if (plane1 == null || videoFormat.getStride1() * videoFormat.getHeight() != plane1.capacity()) {
                    plane1 = ByteBuffer.allocateDirect(videoFormat.getStride1() * videoFormat.getHeight());
                    plane1.order(ByteOrder.nativeOrder());
                }

                // Generates random grayscale bands as the video frame.
                int bandsCount = rand.nextInt(15) + 1;
                int bandBegin = 0;
                int bandThickness = videoFormat.getHeight() * videoFormat.getStride1() / bandsCount;

                for (int i = 0; i < bandsCount; ++i) {
                    byte greyValue = (byte) rand.nextInt(254);
                    // A direct ByteBuffer has no backing array, so fill it with indexed puts.
                    for (int j = bandBegin; j < bandBegin + bandThickness; ++j) {
                        plane1.put(j, greyValue);
                    }
                    bandBegin += bandThickness;
                }

                // Sends video frame to the other participants in the call.
                FrameConfirmation fr = sender.sendFrame(plane1, timeStamp).get();

                // Waits before generating the next video frame.
                // Video format defines how many frames per second the app must generate.
                Thread.sleep((long) (1000.0f / videoFormat.getFramesPerSecond()));
            }
        }

        // Virtual camera hasn't been created yet.
        // Let's wait a little bit before checking again.
        // This is for demo purposes only.
        // Feel free to use a better synchronization mechanism.
        Thread.sleep(100);
    }
    ```
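    The band-drawing portion of the loop above can be factored into a plain helper that's easy to unit test in isolation. This is a sketch (not part of the ACS SDK) that operates on a byte array standing in for the `stride1 * height` frame buffer:

    ```java
    // Hypothetical helper: fills a frame buffer with horizontal grayscale
    // bands, mirroring the band loop in step 7. The array length stands in
    // for stride1 * height; each entry in bandValues becomes one band.
    class BandPattern {
        static void fill(byte[] frame, byte[] bandValues) {
            int bandThickness = frame.length / bandValues.length;
            int bandBegin = 0;
            for (byte value : bandValues) {
                java.util.Arrays.fill(frame, bandBegin, bandBegin + bandThickness, value);
                bandBegin += bandThickness;
            }
        }
    }
    ```

    Keeping frame-content generation separate from the send loop like this also makes it straightforward to swap the test pattern for real pixel data later.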