---
title: Tutorial - Add an Augmented Reality filter to your app
titleSuffix: An Azure Communication Services tutorial
description: In this tutorial, you learn how to add an Augmented Reality filter to your app by using Azure Communication Services and other effects SDKs.
author: sloanster
services: azure-communication-services

ms.author: micahvivion
ms.date: 01/15/2024
ms.topic: tutorial
ms.service: azure-communication-services
ms.subservice: calling
ms.custom: mode-other
---

# Tutorial: How to add Augmented Reality filters to your video calls

> [!NOTE]
> The DeepAR SDK is third-party software licensed under its own terms. Microsoft doesn't make any representations or warranties concerning the use of third-party software.

In some scenarios, you might want to apply video processing to the original camera video, such as background blur or background replacement, to provide a better user experience.
The Azure Communication Services Calling video effects package provides several video processing functions, but it isn't your only choice: you can also integrate other video effects libraries by using the ACS raw media access APIs.

This tutorial uses the [DeepAR SDK](https://www.deepar.ai/) as an example of how to integrate a third-party effects library with the ACS Calling SDK. Let's try DeepAR to enrich your video with Augmented Reality!

## Prerequisites

- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
- An Azure Communication Services resource. For more information, see the [Create an Azure Communication Services resource](../quickstarts/create-communication-resource.md) quickstart.
- An Azure Communication Services voice and video calling enabled client. See [Add video calling to your app](../quickstarts/voice-video-calling/get-started-with-video-calling.md?pivots=platform-web).
- A DeepAR license key. See [Getting started | DeepAR](https://docs.deepar.ai/deepar-sdk/platforms/web/getting-started).

## How video input and output work between the ACS Web SDK and the DeepAR SDK

Both the ACS Web SDK and the DeepAR SDK can read the camera device list and get a video stream directly from the device.
To keep the app consistent, we let the ACS Web SDK own the camera, and the DeepAR SDK provides a way to directly take a video stream acquired from the ACS Web SDK as its input.
In the other direction, the ACS Web SDK needs the processed video stream that the DeepAR SDK outputs, so that it can send this stream to the remote endpoint.
DeepAR offers the option to render its output to a canvas, and the ACS Web SDK can consume the raw video stream captured from that canvas.

Here's the data flow:

:::image type="content" source="./media/ar/videoflow.png" alt-text="Diagram of the data flow between the ACS SDK and the DeepAR SDK.":::
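The bridge in this flow is the canvas: DeepAR draws each processed frame onto it, and the ACS Web SDK captures a stream back off it. The following sketch illustrates that hand-off with plain objects standing in for the real canvas and SDKs (none of these names are real APIs; it's purely a model of the data flow):

```javascript
// Mock of the canvas hand-off. The "DeepAR side" draws processed frames onto
// a canvas stand-in; the "ACS side" reads them back from a captured stream.
function createFakeCanvas() {
  let lastFrame = null;
  return {
    draw(frame) { lastFrame = frame; },                      // DeepAR writes here
    captureStream() { return { latest: () => lastFrame }; }  // ACS reads here
  };
}

const fakeCanvas = createFakeCanvas();
const outputStream = fakeCanvas.captureStream();

fakeCanvas.draw('processed-frame');  // DeepAR renders a processed frame
console.log(outputStream.latest()); // prints "processed-frame"
```

In the real integration, the browser's `canvas.captureStream()` plays the role of `captureStream` here, and DeepAR's renderer plays the role of `draw`.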

## Initialize the DeepAR SDK

To enable DeepAR filters, you first need to initialize the DeepAR SDK by invoking the `deepar.initialize` API:

```javascript
const canvas = document.createElement('canvas');
const deepAR = await deepar.initialize({
  licenseKey: 'YOUR_LICENSE_KEY',
  canvas: canvas,
  additionalOptions: {
    cameraConfig: {
      disableDefaultCamera: true
    }
  }
});
```

Here we disable the default camera because we want the ACS Web SDK to provide the source video stream.
The canvas is required because it's how the ACS Web SDK consumes the video output from the DeepAR SDK.

## Connect the input and output

To start a video call, you normally create a `LocalVideoStream` object as the video input in the SDK:

```javascript
const deviceManager = await callClient.getDeviceManager();
const cameras = await deviceManager.getCameras();
const camera = cameras[0];
const localVideoStream = new LocalVideoStream(camera);
await call.startVideo(localVideoStream);
```

With this approach, the ACS SDK sends out the video from the camera directly, without it being processed by DeepAR.
We instead need to create a path that forwards the video acquired from the ACS SDK to the DeepAR SDK.

```javascript
const deviceManager = await callClient.getDeviceManager();
const cameras = await deviceManager.getCameras();
const camera = cameras[0];
const inputVideoStream = new LocalVideoStream(camera);
const inputMediaStream = await inputVideoStream.getMediaStream();

// Play the raw camera stream in a video element so DeepAR can read from it.
const video = document.createElement('video');

// Keep the canvas dimensions in sync with the incoming video resolution.
const videoResizeCallback = () => {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
};
video.addEventListener('resize', videoResizeCallback);

video.autoplay = true;
video.srcObject = inputMediaStream;
deepAR.setVideoElement(video, true);
```
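The `resize` listener in the block above keeps the canvas dimensions in sync with whatever resolution the camera delivers; if they drift apart, the captured output can end up scaled or distorted. Isolated with plain objects in place of real DOM elements (purely illustrative), the callback does this:

```javascript
// Stand-ins for the <video> element and the canvas, to show what the resize
// callback does: copy the video's current dimensions onto the canvas.
const fakeVideo = { videoWidth: 1280, videoHeight: 720 };
const fakeCanvas = { width: 0, height: 0 };

const resizeCallback = () => {
  fakeCanvas.width = fakeVideo.videoWidth;
  fakeCanvas.height = fakeVideo.videoHeight;
};

resizeCallback(); // fires whenever the video resolution changes
console.log(fakeCanvas.width, fakeCanvas.height); // prints 1280 720
```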
Now the input video is configured. To configure the output video, we need another `LocalVideoStream`, this time built from the media stream captured from the canvas:

```javascript
const outputMediaStream = canvas.captureStream(30);
const outputVideoStream = new LocalVideoStream(outputMediaStream);
await call.startVideo(outputVideoStream);
```

## Start the effect

In DeepAR, effects and background processing are independent, which means you can apply a filter while also enabling background blur or background replacement.

```javascript
// Apply the effect.
await deepAR.switchEffect('https://cdn.jsdelivr.net/npm/deepar/effects/lion');
// Enable background blur.
await deepAR.backgroundBlur(true, 8);
```

:::image type="content" source="./media/ar/screenshot.png" alt-text="Screenshot of the video effect.":::

## Stop the effect

If you want to stop the effect, invoke the `deepAR.clearEffect` API:

```javascript
await deepAR.clearEffect();
```

To disable background blur, pass `false` to the `deepAR.backgroundBlur` API.
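If you want to wire background blur to a single UI button, a small toggle helper can flip the state and forward it to `deepAR.backgroundBlur`. The wrapper below is only a sketch: the mock object stands in for a real `deepAR` instance, and with the real SDK the `backgroundBlur` call returns a promise you should await.

```javascript
// Sketch of a blur toggle: flips an on/off flag and forwards it to DeepAR.
function createBlurToggle(deepAR, strength = 8) {
  let enabled = false;
  return function toggle() {
    enabled = !enabled;
    // With the real SDK, await this call.
    deepAR.backgroundBlur(enabled, strength);
    return enabled;
  };
}

// Usage with a mock deepAR instance that records the calls it receives:
const calls = [];
const mockDeepAR = { backgroundBlur: (on, strength) => calls.push([on, strength]) };
const toggleBlur = createBlurToggle(mockDeepAR);

toggleBlur(); // first click: blur on
toggleBlur(); // second click: blur off
console.log(calls); // prints [ [ true, 8 ], [ false, 8 ] ]
```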

## Disable DeepAR during the video call

If you want to disable DeepAR during the video call, call `deepAR.stopVideo`.
Invoking `deepAR.stopVideo` also ends the current media stream captured from the canvas, so first switch the call's video source back to the camera:

```javascript
await outputVideoStream.switchSource(cameras[0]);
await deepAR.stopVideo();
```
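Because `deepAR.stopVideo` must come after the source switch, it can help to keep both steps behind one guard that also tracks whether DeepAR is still active. This is only a sketch: `switchSource` and `stopVideo` mirror the calls above, while the controller and the mocks are illustrative.

```javascript
// Track whether DeepAR is active, so disabling always switches the call's
// video source back to the camera before stopping DeepAR.
function createFilterController(videoStream, deepAR, camera) {
  let active = true;
  return {
    disable() {
      if (!active) return;
      // With the real SDK, await both calls, in this order.
      videoStream.switchSource(camera);
      deepAR.stopVideo();
      active = false;
    },
    isActive: () => active
  };
}

// Usage with mocks that record the call order:
const order = [];
const mockStream = { switchSource: () => order.push('switchSource') };
const mockDeepAR = { stopVideo: () => order.push('stopVideo') };
const controller = createFilterController(mockStream, mockDeepAR, 'camera-0');

controller.disable();
console.log(order, controller.isActive()); // prints [ 'switchSource', 'stopVideo' ] false
```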

## Next steps

For more information, see the following articles:

- Learn about [video effects](../quickstarts/voice-video-calling/get-started-video-effects.md?pivots=platform-web).
- Learn how to [manage video during calls](../how-tos/calling-sdk/manage-video.md?pivots=platform-web).
- Read the DeepAR documentation: [Getting started | DeepAR](https://docs.deepar.ai/deepar-sdk/platforms/web/getting-started).