
Commit 6f65b61

Merge pull request #203672 from sharifrahaman/sharifrahaman-raw-media
Quick start - raw media access
2 parents 1c50792 + 67b2749

File tree

4 files changed: +437 -291 lines

Lines changed: 18 additions & 291 deletions
@@ -1,306 +1,33 @@
 ---
-title: Quickstart - Add RAW media access to your app (Android)
+title: Quickstart - Add RAW media access to your app
 titleSuffix: An Azure Communication Services quickstart
 description: In this quickstart, you'll learn how to add raw media access calling capabilities to your app using Azure Communication Services.
-author: yassirbisteni
+author: sharifrahaman
-ms.author: yassirb
-ms.date: 06/09/2022
+ms.author: srahaman
+ms.date: 06/30/2022
 ms.topic: quickstart
 ms.service: azure-communication-services
 ms.subservice: calling
+zone_pivot_groups: acs-plat-android-web
 ms.custom: mode-other
 ---

-# Raw Video
+# QuickStart: Add raw media access to your app

-[!INCLUDE [Public Preview](../../includes/public-preview-include-document.md)]
+::: zone pivot="platform-android"
+[!INCLUDE [Raw media with Android](./includes/raw-media/raw-media-access-android.md)]
+::: zone-end

-In this quickstart, you'll learn how to implement raw media access using the Azure Communication Services Calling SDK for Android.
+::: zone pivot="platform-web"
+[!INCLUDE [Raw media with JavaScript](./includes/raw-media/raw-media-access-javascript.md)]
+::: zone-end

-The Azure Communication Services Calling SDK offers APIs that allow apps to generate their own video frames to send to remote participants.

-This quickstart builds upon [QuickStart: Add 1:1 video calling to your app](./get-started-with-video-calling.md?pivots=platform-android) for Android.
+## Next steps
+For more information, see the following articles:

## Virtual Video Stream Overview

Since the app will be generating the video frames, it must inform the Azure Communication Services Calling SDK about the video formats it's capable of generating. This is required to allow the SDK to pick the best video format configuration for the network conditions at any given time.

The app must register a delegate to get notified when it should start or stop producing video frames. The delegate event informs the app which video format is more appropriate for the current network conditions.
### Supported Video Resolutions

| Aspect ratio | Resolution | Maximum FPS |
| :--: | :-: | :-: |
| 16:9 | 1080p | 30 |
| 16:9 | 720p | 30 |
| 16:9 | 540p | 30 |
| 16:9 | 480p | 30 |
| 16:9 | 360p | 30 |
| 16:9 | 270p | 15 |
| 16:9 | 240p | 15 |
| 16:9 | 180p | 15 |
| 4:3 | VGA (640x480) | 30 |
| 4:3 | 424x320 | 15 |
| 4:3 | QVGA (320x240) | 15 |
| 4:3 | 212x160 | 15 |
The following is an overview of the steps required to create a virtual video stream.

1. Create an array of `VideoFormat` entries with the video formats the app supports. It's fine to support only one video format, but at least one of the provided formats must be of the `VideoFrameKind.VIDEO_SOFTWARE` type. When multiple formats are provided, their order in the list doesn't influence or prioritize which one will be used; the selected format is based on external factors like network bandwidth. An example of registering an additional, lower-resolution format follows the code block below.

```java
ArrayList<VideoFormat> videoFormats = new ArrayList<VideoFormat>();

VideoFormat format = new VideoFormat();
format.setWidth(1280);
format.setHeight(720);
format.setPixelFormat(PixelFormat.RGBA);
format.setVideoFrameKind(VideoFrameKind.VIDEO_SOFTWARE);
format.setFramesPerSecond(30);
format.setStride1(1280 * 4); // RGBA is a 32-bit format (4 bytes per pixel), so the stride is width * 4.

videoFormats.add(format);
```
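
For instance, to let the SDK fall back to a lower resolution when bandwidth drops, the app could register a second, smaller format alongside the first. A minimal sketch reusing the same `VideoFormat` setters shown above (the 640x360 values are an illustrative choice from the supported-resolutions table):

```java
// Optional second format: the SDK may select it under constrained bandwidth.
VideoFormat lowResFormat = new VideoFormat();
lowResFormat.setWidth(640);
lowResFormat.setHeight(360);
lowResFormat.setPixelFormat(PixelFormat.RGBA);
lowResFormat.setVideoFrameKind(VideoFrameKind.VIDEO_SOFTWARE);
lowResFormat.setFramesPerSecond(30);
lowResFormat.setStride1(640 * 4); // 4 bytes per RGBA pixel

videoFormats.add(lowResFormat);
```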
2. Create `RawOutgoingVideoStreamOptions` and set `VideoFormats` with the previously created object.

```java
RawOutgoingVideoStreamOptions rawOutgoingVideoStreamOptions = new RawOutgoingVideoStreamOptions();
rawOutgoingVideoStreamOptions.setVideoFormats(videoFormats);
```
3. Subscribe to the `RawOutgoingVideoStreamOptions.addOnOutgoingVideoStreamStateChangedListener` delegate. This delegate reports the state of the current stream; it's important that you don't send frames if the state is not equal to `OutgoingVideoStreamState.STARTED`.

```java
private OutgoingVideoStreamState outgoingVideoStreamState;

rawOutgoingVideoStreamOptions.addOnOutgoingVideoStreamStateChangedListener(event -> {
    outgoingVideoStreamState = event.getOutgoingVideoStreamState();
});
```
4. Make sure the `RawOutgoingVideoStreamOptions.addOnVideoFrameSenderChangedListener` delegate is defined. This delegate informs its listener about events requiring the app to start or stop producing video frames. In this quickstart, `mediaFrameSender` is used as a trigger to let the app know when it's time to start generating frames; feel free to use any mechanism in your app as a trigger. A sketch combining this trigger with the state check from step 3 follows the code block.

```java
private VideoFrameSender mediaFrameSender;

rawOutgoingVideoStreamOptions.addOnVideoFrameSenderChangedListener(event -> {
    mediaFrameSender = event.getVideoFrameSender();
});
```
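
Combining steps 3 and 4: before sending anything, the app should check both the tracked state and that a sender is available. A minimal sketch, where `trySendFrame` is a hypothetical helper of our own (the `sendFrame` and `getTimestampInTicks` calls mirror the `FrameGenerator` example in step 6):

```java
// Hypothetical helper: only send when the stream reports STARTED and a sender exists.
private void trySendFrame(ByteBuffer plane) throws InterruptedException, ExecutionException {
    if (outgoingVideoStreamState != OutgoingVideoStreamState.STARTED || mediaFrameSender == null) {
        return; // Not safe to send frames right now.
    }

    SoftwareBasedVideoFrameSender sender = (SoftwareBasedVideoFrameSender) mediaFrameSender;
    sender.sendFrame(plane, sender.getTimestampInTicks()).get();
}
```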
5. Create an instance of `VirtualRawOutgoingVideoStream` using the `RawOutgoingVideoStreamOptions` created previously. A sketch of attaching the stream to a call follows the code block.

```java
private VirtualRawOutgoingVideoStream virtualRawOutgoingVideoStream;

virtualRawOutgoingVideoStream = new VirtualRawOutgoingVideoStream(rawOutgoingVideoStreamOptions);
```
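
The stream also has to be attached to an active call before its state transitions to `OutgoingVideoStreamState.STARTED`. A minimal sketch, assuming an active `call` object and that `Call.startVideo` accepts the raw stream the same way it accepts a regular local video stream; verify against the SDK version you're using:

```java
// Assumption: the virtual raw outgoing stream can be started like a local video stream.
call.startVideo(getApplicationContext(), virtualRawOutgoingVideoStream).get();
```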
6. Once `outgoingVideoStreamState` is equal to `OutgoingVideoStreamState.STARTED`, create an instance of the `FrameGenerator` class; it starts a non-UI thread that sends frames. Call `FrameGenerator.SetVideoFrameSender` each time you get an updated `VideoFrameSender` on the previous delegate, and cast the `VideoFrameSender` to the appropriate type defined by the `VideoFrameKind` property of `VideoFormat`. For example, cast it to `SoftwareBasedVideoFrameSender` and then call the `sendFrame` method according to the number of planes defined by the `VideoFormat`.
After that, create the `ByteBuffer` backing the video frame if needed. Then, update the content of the video frame. Finally, send the video frame to other participants with the `sendFrame` API. A registration sketch follows the class.

```java
public class FrameGenerator implements VideoFrameSenderChangedListener {

    private VideoFrameSender videoFrameSender;
    private Thread frameIteratorThread;
    private final Random random;
    private volatile boolean stopFrameIterator = false;

    public FrameGenerator() {
        random = new Random();
    }

    // Invoked when the SDK provides a new VideoFrameSender (the event type name
    // follows the VideoFrameSenderChangedListener interface; adjust to your SDK version).
    @Override
    public void onVideoFrameSenderChanged(VideoFrameSenderChangedEvent event) {
        SetVideoFrameSender(event.getVideoFrameSender());
    }

    // Restart the iterator thread against the updated sender.
    public void SetVideoFrameSender(VideoFrameSender videoFrameSender) {
        StopFrameIterator();
        this.videoFrameSender = videoFrameSender;
        StartFrameIterator();
    }

    public void FrameIterator() {
        ByteBuffer plane = null;
        while (!stopFrameIterator && videoFrameSender != null) {
            plane = GenerateFrame(plane);
        }
    }

    private ByteBuffer GenerateFrame(ByteBuffer plane) {
        try {
            VideoFormat videoFormat = videoFrameSender.getVideoFormat();
            // (Re)allocate the backing buffer if the format's frame size changed.
            if (plane == null || videoFormat.getStride1() * videoFormat.getHeight() != plane.capacity()) {
                plane = ByteBuffer.allocateDirect(videoFormat.getStride1() * videoFormat.getHeight());
                plane.order(ByteOrder.nativeOrder());
            }

            // Fill the frame with a random number of random grey horizontal bands.
            int bandsCount = random.nextInt(15) + 1;
            int bandBegin = 0;
            int bandThickness = videoFormat.getHeight() * videoFormat.getStride1() / bandsCount;

            for (int i = 0; i < bandsCount; ++i) {
                byte greyValue = (byte) random.nextInt(254);
                java.util.Arrays.fill(plane.array(), bandBegin, bandBegin + bandThickness, greyValue);
                bandBegin += bandThickness;
            }

            if (videoFrameSender instanceof SoftwareBasedVideoFrameSender) {
                SoftwareBasedVideoFrameSender sender = (SoftwareBasedVideoFrameSender) videoFrameSender;

                long timeStamp = sender.getTimestampInTicks();
                sender.sendFrame(plane, timeStamp).get();
            } else {
                HardwareBasedVideoFrameSender sender = (HardwareBasedVideoFrameSender) videoFrameSender;

                // Upload the buffer into an OpenGL texture and send the texture instead.
                int[] textureIds = new int[1];
                int targetId = GLES20.GL_TEXTURE_2D;

                GLES20.glEnable(targetId);
                GLES20.glGenTextures(1, textureIds, 0);
                GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
                GLES20.glBindTexture(targetId, textureIds[0]);
                GLES20.glTexImage2D(targetId,
                        0,
                        GLES20.GL_RGB,
                        videoFormat.getWidth(),
                        videoFormat.getHeight(),
                        0,
                        GLES20.GL_RGB,
                        GLES20.GL_UNSIGNED_BYTE,
                        plane);

                long timeStamp = sender.getTimestampInTicks();
                sender.sendFrame(targetId, textureIds[0], timeStamp).get();
            }

            // Pace frame generation to the format's frame rate.
            Thread.sleep((long) (1000.0f / videoFormat.getFramesPerSecond()));
        } catch (InterruptedException ex) {
            Log.d("FrameGenerator", String.format("FrameGenerator.GenerateFrame, %s", ex.getMessage()));
        } catch (ExecutionException ex2) {
            Log.d("FrameGenerator", String.format("FrameGenerator.GenerateFrame, %s", ex2.getMessage()));
        }

        return plane;
    }

    private void StartFrameIterator() {
        frameIteratorThread = new Thread(this::FrameIterator);
        frameIteratorThread.start();
    }

    public void StopFrameIterator() {
        try {
            if (frameIteratorThread != null) {
                stopFrameIterator = true;
                frameIteratorThread.join();
                frameIteratorThread = null;
                stopFrameIterator = false;
            }
        } catch (InterruptedException ex) {
            Log.d("FrameGenerator", String.format("FrameGenerator.StopFrameIterator, %s", ex.getMessage()));
        }
    }
}
```
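
As a usage sketch: since `FrameGenerator` implements `VideoFrameSenderChangedListener`, it can be registered directly as the step 4 listener so the iterator thread follows the sender's lifetime.

```java
// Hook the generator up to the options from step 2; it starts producing frames
// whenever the SDK hands it a VideoFrameSender.
FrameGenerator frameGenerator = new FrameGenerator();
rawOutgoingVideoStreamOptions.addOnVideoFrameSenderChangedListener(frameGenerator);

// When leaving the call, stop the background thread.
frameGenerator.StopFrameIterator();
```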
## Screen Share Video Stream Overview

Repeat steps 1 to 4 from the previous virtual video stream walkthrough.

Since the Android system generates the frames, you must implement your own foreground service to capture the frames and send them through the Azure Communication Services Calling API.

### Supported Video Resolutions

| Aspect ratio | Resolution | Maximum FPS |
| :--: | :-: | :-: |
| Anything | Anything | 30 |

The following is an overview of the steps required to create a screen share video stream.
1. Add this permission to the `AndroidManifest.xml` file inside your Android project. A service declaration sketch follows the code block.

```xml
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
```
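
On Android 10 and later, the service that performs the capture must also be declared with the `mediaProjection` foreground service type. A sketch, where `ScreenCaptureService` is a hypothetical name for your own service:

```xml
<!-- ScreenCaptureService is a hypothetical name for your own foreground service. -->
<service
    android:name=".ScreenCaptureService"
    android:foregroundServiceType="mediaProjection" />
```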
2. Create an instance of `ScreenShareRawOutgoingVideoStream` using the `RawOutgoingVideoStreamOptions` created previously.

```java
private ScreenShareRawOutgoingVideoStream screenShareRawOutgoingVideoStream;

screenShareRawOutgoingVideoStream = new ScreenShareRawOutgoingVideoStream(rawOutgoingVideoStreamOptions);
```
3. Request the permissions needed for screen capture on Android. Once this method is called, Android automatically calls `onActivityResult` with the request code we sent and the result of the operation. Expect `Activity.RESULT_OK` if the user granted the permission; if so, attach the `screenShareRawOutgoingVideoStream` to the call and start your own foreground service to capture the frames, as sketched after this code block.

```java
public void GetScreenSharePermissions() {
    try {
        MediaProjectionManager mediaProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), Constants.SCREEN_SHARE_REQUEST_INTENT_REQ_CODE);
    } catch (Exception e) {
        String error = "Could not start screen share due to failure to startActivityForResult for mediaProjectionManager screenCaptureIntent";
        Log.d("FrameGenerator", error);
    }
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);

    if (requestCode == Constants.SCREEN_SHARE_REQUEST_INTENT_REQ_CODE) {
        if (resultCode == Activity.RESULT_OK && data != null) {
            // Attach the screenShareRawOutgoingVideoStream to the call
            // Start your foreground service
        } else {
            String error = "User cancelled, did not give permission to capture screen";
            Log.d("FrameGenerator", error);
        }
    }
}
```
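
A minimal sketch of that success branch, assuming the raw stream can be handed to `Call.startVideo` like a regular local video stream, and where `ScreenCaptureService` is the hypothetical foreground service declared earlier:

```java
// Assumption: startVideo accepts the screen share raw outgoing stream.
call.startVideo(getApplicationContext(), screenShareRawOutgoingVideoStream).get();

// Hand the projection grant to your own foreground service (hypothetical class),
// which obtains the MediaProjection and starts capturing.
Intent serviceIntent = new Intent(this, ScreenCaptureService.class);
serviceIntent.putExtra("resultCode", resultCode);
serviceIntent.putExtra("data", data);
startForegroundService(serviceIntent);
```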
4. Once you receive a frame in your foreground service, send it through using the `VideoFrameSender` provided. A sketch of the capture setup that feeds this callback follows the code block.

```java
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image != null) {
        final Image.Plane[] planes = image.getPlanes();
        if (planes.length > 0) {
            Image.Plane plane = planes[0];
            final ByteBuffer buffer = plane.getBuffer();
            try {
                SoftwareBasedVideoFrameSender sender = (SoftwareBasedVideoFrameSender) videoFrameSender;
                sender.sendFrame(buffer, sender.getTimestampInTicks()).get();
            } catch (Exception ex) {
                Log.d("MainActivity", "MainActivity.onImageAvailable trace, failed to send Frame");
            }
        }

        image.close();
    }
}
```
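
For context, `onImageAvailable` fires only after the service wires an `ImageReader` surface to a virtual display. A minimal sketch of that setup inside the hypothetical `ScreenCaptureService`, using standard Android media-projection APIs; the `width`, `height`, and capture name are illustrative assumptions you'd derive from the negotiated `VideoFormat`:

```java
// Inside the foreground service, after startForeground() has been called.
// resultCode and data come from onActivityResult via the service intent extras.
int width = 1280;  // Assumption: match the negotiated VideoFormat.
int height = 720;
int screenDensity = getResources().getConfiguration().densityDpi;
Handler handler = new Handler(Looper.getMainLooper());

MediaProjectionManager manager =
        (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
MediaProjection mediaProjection = manager.getMediaProjection(resultCode, data);

// The ImageReader's surface receives the mirrored screen frames.
ImageReader imageReader = ImageReader.newInstance(width, height,
        android.graphics.PixelFormat.RGBA_8888, 2);
imageReader.setOnImageAvailableListener(this::onImageAvailable, handler);

// Each frame rendered into the virtual display triggers onImageAvailable above.
mediaProjection.createVirtualDisplay("ScreenCapture", width, height, screenDensity,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        imageReader.getSurface(), null, handler);
```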
+- Check out our [calling hero sample](../../samples/calling-hero-sample.md)
+- Get started with the [UI Library](https://aka.ms/acsstorybook)
+- Learn about [Calling SDK capabilities](./getting-started-with-calling.md?pivots=platform-web)
+- Learn more about [how calling works](../../concepts/voice-video-calling/about-call-types.md)
