Commit 2d12b09 — Update get-started-raw-media-access.md
1 parent 6b2e810 commit 2d12b09

1 file changed: +183 −86 lines

articles/communication-services/quickstarts/voice-video-calling/get-started-raw-media-access.md

title: Quickstart - Add RAW media access to your app (Android)
titleSuffix: An Azure Communication Services quickstart
description: In this quickstart, you'll learn how to add raw media access calling capabilities to your app using Azure Communication Services.
author: Yassir Bisteni
ms.author: yassirb
ms.date: 04/19/2022
ms.topic: quickstart
ms.service: azure-communication-services
ms.subservice: calling

In this quickstart, you'll learn how to implement raw media access using the Azure Communication Services Calling SDK for Android.

The Azure Communication Services Calling SDK offers APIs that allow apps to generate their own video frames to send to remote participants.

This quickstart builds upon [QuickStart: Add 1:1 video calling to your app](./get-started-with-video-calling.md?pivots=platform-android) for Android.

## Virtual Video Stream Overview

Since the app will be generating the video frames, the app must inform the Azure Communication Services Calling SDK about the video formats it is capable of generating. This information allows the Azure Communication Services Calling SDK to pick the best video format configuration for the network conditions at any given time.

The app must register a delegate to get notified when it should start or stop producing video frames. The delegate event informs the app which video format is best suited for the current network conditions.

The following is an overview of the steps required to create a virtual video stream.

1. Create an array of `VideoFormat` with the video formats supported by the app. It is fine to support only one video format, but at least one of the provided video formats must be of the `VideoFrameKind::VideoSoftware` type. When multiple formats are provided, their order in the list does not influence or prioritize which one will be used. The selected format is based on external factors like network bandwidth.

```java
ArrayList<VideoFormat> videoFormats = new ArrayList<VideoFormat>();

VideoFormat format = new VideoFormat();
format.setWidth(1280);
format.setHeight(720);
format.setPixelFormat(PixelFormat.RGBA);
format.setMediaFrameKind(VideoFrameKind.VIDEO_SOFTWARE);
format.setFramesPerSecond(30);
format.setStride1(1280 * 4); // Times 4 because RGBA is a 32-bit (4-byte) pixel format.

videoFormats.add(format);
```

2. Create `OutgoingVirtualVideoStreamOptions` and set `VideoFormats` with the previously created object.

```java
OutgoingVirtualVideoStreamOptions options = new OutgoingVirtualVideoStreamOptions();
options.setVideoFormats(videoFormats);
```

3. Subscribe to the `OutgoingVirtualVideoStreamOptions::addOnOutgoingVideoStreamStateChangedListener` delegate. This delegate reports the state of the current stream; it is important that you do not send frames if the state is not equal to `OutgoingVideoStreamState.STARTED`.

```java
private OutgoingVideoStreamState outgoingVideoStreamState;

options.addOnOutgoingVideoStreamStateChangedListener(event -> {
    outgoingVideoStreamState = event.getOutgoingVideoStreamState();
});
```

4. Make sure the `OutgoingVirtualVideoStreamOptions::addOnVideoFrameSenderChangedListener` delegate is defined. This delegate informs its listener about events requiring the app to start or stop producing video frames. In this quickstart, `mediaFrameSender` is used as a trigger to let the app know when it's time to start generating frames. Feel free to use any mechanism in your app as a trigger.

```java
private VideoFrameSender mediaFrameSender;

options.addOnVideoFrameSenderChangedListener(event -> {
    mediaFrameSender = event.getMediaFrameSender();
});
```

5. Create an instance of `VirtualVideoStream` using the `OutgoingVirtualVideoStreamOptions` we created previously.

```java
private VirtualVideoStream virtualVideoStream;

virtualVideoStream = new VirtualVideoStream(options);
```

6. Once `outgoingVideoStreamState` is equal to `OutgoingVideoStreamState.STARTED`, create an instance of the `FrameGenerator` class. This class starts a non-UI thread that sends frames. Call `FrameGenerator.SetVideoFrameSender` each time an updated `VideoFrameSender` is received through the previous delegate. Cast the `VideoFrameSender` to the appropriate type defined by the `VideoFrameKind` property of `VideoFormat`; for example, cast it to `SoftwareBasedVideoFrameSender` and then call the `send` method according to the number of planes defined by the `MediaFormat`.
After that, create the `ByteBuffer` backing the video frame if needed. Then, update the content of the video frame. Finally, send the video frame to other participants with the `sendFrame` API. A short wiring sketch follows the class below.

```java
public class FrameGenerator {

    private VideoFrameSender videoFrameSender;
    private Thread frameIteratorThread;
    private final Random random;
    private volatile boolean stopFrameIterator = false;

    public FrameGenerator() {
        random = new Random();
    }

    public void FrameIterator() {
        ByteBuffer plane = null;
        while (!stopFrameIterator && videoFrameSender != null) {
            plane = GenerateFrame(plane);
        }
    }

    private ByteBuffer GenerateFrame(ByteBuffer plane) {
        try {
            SoftwareBasedVideoFrameSender sender = (SoftwareBasedVideoFrameSender) videoFrameSender;
            VideoFormat videoFormat = sender.getVideoFormat();

            // Gets the timestamp for when the video frame has been created.
            // This allows better synchronization with audio.
            long timeStamp = sender.getTimestamp();

            // Adjusts the buffer to the video format that network conditions can manage.
            if (plane == null || videoFormat.getStride1() * videoFormat.getHeight() != plane.capacity()) {
                plane = ByteBuffer.allocateDirect(videoFormat.getStride1() * videoFormat.getHeight());
                plane.order(ByteOrder.nativeOrder());
            }

            // Generates random gray-scaled bands as the video frame.
            int bandsCount = random.nextInt(15) + 1;
            int bandBegin = 0;
            int bandThickness = videoFormat.getHeight() * videoFormat.getStride1() / bandsCount;

            for (int i = 0; i < bandsCount; ++i) {
                byte greyValue = (byte) random.nextInt(254);
                java.util.Arrays.fill(plane.array(), bandBegin, bandBegin + bandThickness, greyValue);
                bandBegin += bandThickness;
            }

            // Sends the video frame to the other participants in the call.
            FrameConfirmation fr = sender.sendFrame(plane, timeStamp).get();

            // Waits before generating the next video frame.
            // The video format defines how many frames per second the app must generate.
            Thread.sleep((long) (1000.0f / videoFormat.getFramesPerSecond()));
        }
        catch (InterruptedException ex) {
            ex.printStackTrace();
        }
        catch (ExecutionException ex2) {
            ex2.printStackTrace();
        }

        return plane;
    }

    private void StartFrameIterator() {
        frameIteratorThread = new Thread(this::FrameIterator);
        frameIteratorThread.start();
    }

    public void StopFrameIterator() {
        try {
            if (frameIteratorThread != null) {
                stopFrameIterator = true;
                frameIteratorThread.join();
                frameIteratorThread = null;
                stopFrameIterator = false;
            }
        }
        catch (InterruptedException ex) {
            ex.printStackTrace();
        }
    }

    // Called by the app whenever a new VideoFrameSender becomes available.
    public void SetVideoFrameSender(VideoFrameSender videoFrameSender) {
        StopFrameIterator();
        this.videoFrameSender = videoFrameSender;
        StartFrameIterator();
    }
}
```
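
The sketch below shows one way the listeners from steps 3 and 4 might be wired to a `FrameGenerator` instance. It is a minimal illustration: the `frameGenerator` field and the stop-on-state-change behavior are assumptions, not part of the original sample; only the delegates and state names shown earlier in this quickstart are SDK API.

```java
// Minimal wiring sketch. Assumption: frameGenerator is an instance of the
// FrameGenerator class above; everything except the SDK delegates is illustrative.
private FrameGenerator frameGenerator = new FrameGenerator();

options.addOnVideoFrameSenderChangedListener(event -> {
    // Hand the latest sender to the generator; it restarts its worker thread.
    frameGenerator.SetVideoFrameSender(event.getMediaFrameSender());
});

options.addOnOutgoingVideoStreamStateChangedListener(event -> {
    outgoingVideoStreamState = event.getOutgoingVideoStreamState();
    if (outgoingVideoStreamState != OutgoingVideoStreamState.STARTED) {
        // Stop producing frames whenever the stream is not in the STARTED state.
        frameGenerator.StopFrameIterator();
    }
});
```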

## Screen Share Video Stream Overview

Repeat steps `1 to 4` from the previous VirtualVideoStream tutorial.

Since the Android system generates the frames, you have to implement your own foreground service to capture the frames and send them through using our API. A minimal sketch of such a service is shown after the steps below.

The following is an overview of the steps required to create a screen share video stream.

1. Add this permission to the `AndroidManifest.xml` file inside your Android project.

```xml
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
```

2. Create an instance of `ScreenShareVideoStream` using the `OutgoingVirtualVideoStreamOptions` we created previously.

```java
private ScreenShareVideoStream screenShareVideoStream;

screenShareVideoStream = new ScreenShareVideoStream(options);
```

3. Request the permissions needed for screen capture on Android. Once this method is called, Android automatically calls `onActivityResult` with the request code we sent and the result of the operation. Expect `Activity.RESULT_OK` if the user granted the permission; if so, attach the `screenShareVideoStream` to the call and start your own foreground service to capture the frames.

```java
public void GetScreenSharePermissions() {
    try {
        MediaProjectionManager mediaProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), Constants.SCREEN_SHARE_REQUEST_INTENT_REQ_CODE);
    } catch (Exception e) {
        String error = "Could not start screen share due to failure to startActivityForResult for mediaProjectionManager screenCaptureIntent";
    }
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);

    if (requestCode == Constants.SCREEN_SHARE_REQUEST_INTENT_REQ_CODE) {
        if (resultCode == Activity.RESULT_OK && data != null) {
            // Attach the screenShareVideoStream to the call
            // Start your foreground service
        } else {
            String error = "User cancelled, did not give permission to capture screen";
        }
    }
}
```

4. Once you receive a frame in your foreground service, send it through using the `VideoFrameSender` provided.

```java
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image != null) {
        final Image.Plane[] planes = image.getPlanes();
        if (planes.length > 0) {
            Image.Plane plane = planes[0];
            final ByteBuffer buffer = plane.getBuffer();
            try {
                SoftwareBasedVideoFrameSender sender = (SoftwareBasedVideoFrameSender) videoFrameSender;
                sender.sendFrame(buffer, sender.getTimestamp()).get();
            } catch (Exception ex) {
                Log.d("MainActivity", "MainActivity.onImageAvailable trace, failed to send frame");
            }
        }

        image.close();
    }
}
```
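
For reference, the following is a minimal sketch of what such a foreground capture service could look like, built only on standard Android `MediaProjection` and `ImageReader` APIs. The class name `ScreenCaptureService`, the fixed 1280x720 capture size, the intent extra names, and the omitted notification setup are assumptions for illustration; none of them are part of the Azure Communication Services SDK.

```java
import android.app.Service;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.IBinder;

// Hypothetical service: capture the screen with MediaProjection and surface the
// frames through an ImageReader, whose callback forwards them as shown in step 4.
public class ScreenCaptureService extends Service {

    private MediaProjection mediaProjection;
    private VirtualDisplay virtualDisplay;
    private ImageReader imageReader;

    @Override
    public IBinder onBind(Intent intent) { return null; }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Promote the service to the foreground before starting the projection.
        // Notification creation is omitted for brevity.
        // startForeground(NOTIFICATION_ID, buildNotification());

        // Assumption: the activity forwards the onActivityResult values as extras.
        int resultCode = intent.getIntExtra("resultCode", 0);
        Intent resultData = intent.getParcelableExtra("resultData");

        MediaProjectionManager manager =
                (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
        mediaProjection = manager.getMediaProjection(resultCode, resultData);

        // RGBA_8888 matches the 32-bit RGBA format registered with the Calling SDK.
        imageReader = ImageReader.newInstance(1280, 720, PixelFormat.RGBA_8888, 2);
        imageReader.setOnImageAvailableListener(this::onImageAvailable, null);

        // Mirror the screen into the ImageReader's surface; every new image triggers
        // onImageAvailable, where the frame is sent with sendFrame (step 4).
        virtualDisplay = mediaProjection.createVirtualDisplay(
                "ScreenShare", 1280, 720, getResources().getDisplayMetrics().densityDpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                imageReader.getSurface(), null, null);

        return START_STICKY;
    }

    private void onImageAvailable(ImageReader reader) {
        // Forward the captured frame through the VideoFrameSender, as shown in step 4.
    }

    @Override
    public void onDestroy() {
        if (virtualDisplay != null) { virtualDisplay.release(); }
        if (mediaProjection != null) { mediaProjection.stop(); }
        if (imageReader != null) { imageReader.close(); }
        super.onDestroy();
    }
}
```

In `onActivityResult`, such a service could be started with `startForegroundService(...)`, passing `resultCode` and `data` as extras. On recent Android versions, the service also needs to be declared in the manifest with `android:foregroundServiceType="mediaProjection"`.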
