
Commit 0455866

Authored by skydoves, tschellenbach and tbarbugli
[WIP] Livestream tutorials (#697)
* wip
* wip
* wip on livestreaming tutorial
* minor edits
* wip on docs
* Refactor flow condition of the durationInMs
* counts and startedAt
* Implement liveDurationInMs
* Add camera and microphone toggle buttons and improve designs
* Update docs for cookbook - watching livestream

Co-authored-by: Thierry Schellenbach <thierryschellenbach@gmail.com>
Co-authored-by: Tommaso Barbugli <tbarbugli@gmail.com>
1 parent 9c34fd9 commit 0455866
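The commit message mentions implementing `liveDurationInMs` from `startedAt`. As a rough illustration only (the function name comes from the commit message, but the logic below is an assumption, not the SDK's actual implementation), such a helper might look like:

```kotlin
// Hypothetical sketch: derive how long a livestream has been live from its
// startedAt timestamp. The real SDK code is not shown in this diff.
fun liveDurationInMs(startedAtEpochMs: Long?, nowEpochMs: Long): Long {
    // Not yet live: no duration to report.
    if (startedAtEpochMs == null) return 0L
    // Guard against clock skew producing a negative duration.
    return (nowEpochMs - startedAtEpochMs).coerceAtLeast(0L)
}
```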

File tree

21 files changed (+731 additions, −204 deletions)

README.md

Lines changed: 19 additions & 9 deletions
````diff
@@ -104,30 +104,40 @@ Video roadmap and changelog is available [here](https://github.com/GetStream/pro
 
 - [ ] Complete integration with the video demo flow
 - [ ] Finish usability testing with design team on chat integration (Jaewoong)
-- [ ] Ringing: Finish it, make testing easy and write docs for common changes (Daniel)
-- [ ] Enable ice restarts for publisher and subscriber
-- [ ] Livestream tutorial (depends on RTMP support) (Thierry)
+- [X] Ringing: Finish it, make testing easy and write docs for common changes (Daniel)
 - [ ] Bug: Screensharing on Firefox has some issues when rendering on android (Daniel)
+- [ ] Pagination on query members & query call endpoints (Daniel)
+- [ ] local version of audioLevel(s) for lower latency audio visualizations(Daniel)
+- [ ] Android SDK development.md cleanup (Daniel)
+- [ ] Livestream tutorial (depends on RTMP support) (Thierry)
 - [ ] Call Analytics stateflow (Thierry)
-- [ ] Pagination on query members & query channel endpoints (Daniel)
+- [ ] Enable ice restarts for publisher and subscriber
 - [ ] Test coverage
 - [ ] Testing on more devices
-- [ ] local version of audioLevel(s) for lower latency audio visualizations(Daniel)
-- [ ] Android SDK development.md cleanup (Daniel)
 - [ ] Logging is too verbose (rtc is very noisy), clean it up to focus on the essential for info and higher
 
 ### 0.4.0 milestone
 
 - [ ] Upgrade to more recent versions of webrtc
-- [ ] Screensharing from mobile
 - [ ] Tap to focus
-- [ ] Camera controls
 - [ ] Picture of the video stream at highest resolution
 - [ ] Review foreground service vs backend for some things like screensharing etc
 - [ ] Audio & Video filters. Support (Daniel)
 - [ ] H264 workaround on Samsung 23 (see https://github.com/livekit/client-sdk-android/blob/main/livekit-android-sdk/src/main/java/io/livekit/android/webrtc/SimulcastVideoEncoderFactoryWrapper.kt#L34 and
 - https://github.com/react-native-webrtc/react-native-webrtc/issues/983#issuecomment-975624906)
-- [ ] Dynascale 2.0 (codecs, f resolution switches, resolution webrtc handling)
+- [ ] Dynascale 2.0
+
+### 0.5.0 milestone
+
+- [ ] Screensharing from mobile
+- [ ] Camera controls
+
+### Dynascale 2.0
+
+- currently we support selecting which of the 3 layers you want to send: f, h and q. in addition we should support:
+- changing the resolution of the f track
+- changing the codec that's used from VP8 to h264 or vice versa
+- detecting when webrtc changes the resolution of the f track, and notifying the server about it (if needed)
 
 ## 💼 We are hiring!
 
````

docusaurus/docs/Android/02-tutorials/01-video-calling.mdx

Lines changed: 6 additions & 4 deletions
````diff
@@ -18,15 +18,17 @@ This tutorial teaches you how to build Zoom/Whatsapp style video calling for you
 2. Select Phone & Tablet -> **Empty Activity**
 3. Name your project **VideoCall**.
 
-Note that setup steps can vary slightly across Android Studio versions.
-If you run into trouble, make sure to use the latest version of Android Studio ([Flamingo](https://developer.android.com/studio/releases#android-studio-flamingo-|-2022.2.1-patch-2-may-2023) or higher).
+Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
+We recommend using Android Studio Giraffe or newer.
 
 ### Step 2 - Install the SDK & Setup the client
 
-**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle`.
+**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
 If you're new to android, note that there are 2 `build.gradle` files, you want to open the `build.gradle` in the app folder.
 
-```groovy
+
+
+```kotlin
 dependencies {
     // Stream Video Compose SDK
     implementation("io.getstream:stream-video-android-compose:0.2.0")
````
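The diff above is cut off mid-block by the page. For orientation only, a complete `dependencies` block for a Compose app might look roughly like the following; the Jetpack Compose coordinates and versions here are assumptions for illustration, not taken from this commit:

```kotlin
dependencies {
    // Stream Video Compose SDK (version shown in the diff)
    implementation("io.getstream:stream-video-android-compose:0.2.0")

    // Jetpack Compose (illustrative coordinates; check the current Compose BOM)
    implementation(platform("androidx.compose:compose-bom:2023.06.01"))
    implementation("androidx.compose.ui:ui")
    implementation("androidx.compose.material3:material3")
    implementation("androidx.activity:activity-compose:1.7.2")
}
```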

docusaurus/docs/Android/02-tutorials/02-audio-room.mdx

Lines changed: 3 additions & 4 deletions
````diff
@@ -20,17 +20,16 @@ Time to get started building an audio-room for your app.
 
 ### Step 1 - Create a new project in Android Studio
 
-This tutorial was written using [Android Studio Flamingo](https://developer.android.com/studio/releases#android-studio-flamingo-|-2022.2.1-patch-2-may-2023).
-Setup steps can vary slightly across Android Studio versions.
-If you run into trouble, make sure to use the latest version of Android Studio.
+Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
+We recommend using Android Studio Giraffe or newer.
 
 1. Create a new project
 2. Select Phone & Tablet -> **Empty Activity**
 3. Name your project **AudioRoom**.
 
 ### Step 2 - Install the SDK & Setup the client
 
-**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle`.
+**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
 If you're new to android, note that there are 2 `build.gradle` files, you want to open the `build.gradle` in the app folder.
 
 ```groovy
````

docusaurus/docs/Android/02-tutorials/03-livestream.mdx

Lines changed: 74 additions & 72 deletions
````diff
@@ -5,20 +5,13 @@ description: How to build a livestream experience using Stream's video SDKs
 
 import { TokenSnippet } from '../../../shared/_tokenSnippet.jsx';
 
-:::danger
-
-This tutorial is almost ready, but not quite finished yet
-:::
-
-## Livestream Tutorial
-
 In this tutorial we'll quickly build a low-latency in-app livestreaming experience.
 The livestream is broadcasted using Stream's edge network of servers around the world.
 We'll cover the following topics:
 
 * Ultra low latency streaming
 * Multiple streams & co-hosts
-* RTMP in and Webrtc input
+* RTMP in and WebRTC input
 * Exporting to HLS
 * Reactions, custom events and chat
 * Recording & Transcriptions
````
````diff
@@ -27,25 +20,27 @@ Let's get started, if you have any questions or feedback be sure to let us know
 
 ### Step 1 - Create a new project in Android Studio
 
-This tutorial was written using Android Studio Flamingo.
-Setup steps can vary slightly across Android Studio versions so if you run into trouble be sure to use the latest version of Android Studio.
+Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
+We recommend using Android Studio Giraffe or newer.
 
 1. Create a new project
 2. Select Phone & Template -> **empty activity**
 3. Name your project **Livestream**.
 
 ### Step 2 - Install the SDK & Setup the client
 
-**Add the video SDK** to your app's `build.gradle` file found in app/build.gradle.
+**Add the video SDK** to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
 If you're new to android note that there are 2 build.gradle files, you want to open the one in the app folder.
 
-```groovy
+```kotlin
 dependencies {
-    implementation "io.getstream:stream-video-android-compose:$stream_version"
+    implementation("io.getstream:stream-video-android-compose:0.2.0")
+
+    ...
 }
 ```
 
-This tutorial uses the compose version of the SDK. Stream also provides a core library without compose.
+This tutorial uses the compose version of the video SDK. Stream also provides a core library without compose.
 
 ### Step 3 - Broadcast a livestream from your phone
 
````
````diff
@@ -110,7 +105,10 @@ Replace them now with the values shown below:
 
 <TokenSnippet sampleApp='livestream' displayStyle='credentials' />
 
-In the next step we setup the user:
+When you run the app now you'll see a text message saying: "TODO: render video".
+Before we get around to rendering the video let's review the code above.
+
+In the first step we setup the user:
 
 ```kotlin
 val user = User(
@@ -119,15 +117,15 @@ val user = User(
 )
 ```
 
-If you don't have an authenticated user you can also use a guest or anonymous user. TODO DOCS
+If you don't have an authenticated user you can also use a guest or anonymous user.
 For most apps it's convenient to match your own system of users to grant and remove permissions.
 
 Next we create the client:
 
 ```kotlin
 val client = StreamVideoBuilder(
     context = applicationContext,
-    apiKey = "hd8szvscpxvd", // demo API key
+    apiKey = "mmhfdzb5evj2", // demo API key
     geo = GEO.GlobalEdgeNetwork,
     user = user,
     token = userToken,
@@ -136,7 +134,9 @@ val client = StreamVideoBuilder(
 
 You'll see the `userToken` variable. Your backend typically generates the user token on signup or login.
 
-In the next step we create and join the call. The call object is used for video calls, audio rooms and livestreaming.
+The most important step to review is how we create the call.
+Stream uses the same call object for livestreaming, audio rooms and video calling.
+Have a look at the code snippet below:
 
 ```kotlin
 val call = client.call("livestream", callId)
````
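The tutorial text in this hunk notes that your backend generates the `userToken`. Stream user tokens are JWT-shaped, so when debugging token problems locally it can help to peek at a token's payload. Below is a small, hypothetical helper (not part of the Stream SDK, and it performs no signature verification) for doing that:

```kotlin
import java.util.Base64

// Hypothetical debugging helper, not part of the Stream SDK: decode the
// payload of a JWT-shaped token without verifying its signature.
// Only use this for local inspection; verification belongs on the server.
fun jwtPayload(token: String): String {
    val parts = token.split(".")
    require(parts.size == 3) { "not a JWT-shaped token" }
    // The payload is the middle segment, base64url-encoded.
    return String(Base64.getUrlDecoder().decode(parts[1]))
}
```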
````diff
@@ -149,13 +149,20 @@ lifecycleScope.launch {
 }
 ```
 
-`call.join(create = true)` is the simplest example.
-It's also possible to configure settings on the call or add co-hosts. TODO: Docs
+First the call object is created by specifying the call type: "livestream" and the callId.
+The "livestream" call type is just a set of defaults that typically works well for a livestream.
+You can edit the features, permissions and settings in the dashboard.
+The dashboard also allows you to create new call types as needed.
+
+Lastly, call.join(create = true) creates the call object on our servers and joins it.
+The moment you use call.join() the realtime transport for audio and video is started.
 
-### Step 4 - Render the video
+Note that you can also add members to a call and assign them different roles. (See the [call creation docs](../03-guides/02-joining-creating-calls.mdx))
+
+### Step 4 - Rendering the video
 
 In this step we're going to build a UI for showing your local video with a button to start the livestream.
-This example uses compose, but you could also use our XML VideoRenderer.
+This example uses Compose, but you could also use our XML VideoRenderer.
 
 In `MainActivity.kt` replace the `VideoTheme` with the following code.
 
````
````diff
@@ -244,12 +251,15 @@ If you now run your app you should see an interface like this:
 
 ![Livestream](../assets/tutorial-livestream.png)
 
-When you press **go live** your video will be transmitted.
-Press go live in the android app and click the link below to watch it in your browser.
+Stream uses a technology called SFU cascading to replicate your livestream over different SFUs around the world.
+This makes it possible to reach a large audience in realtime.
+
+Now let's press "go live" in the android app and click the link below to watch the video in your browser.
 
 <TokenSnippet sampleApp='livestream' displayStyle='join' />
 
-Let's take a moment to review the compose code above `Call.state` exposes all the stateflow objects you need.
+Let's take a moment to review the Compose code above. Call.state exposes all the stateflow objects you need.
+
 The most important ones are:
 
 ```
````
````diff
@@ -259,82 +269,74 @@ call.state.participants
 
 The [participant state docs](../03-guides/03-call-and-participant-state.mdx) show all the available data.
 
-The compose layout is vanilla compose other than **[VideoRenderer](../04-ui-components/02-video-renderer.mdx)**.
-`VideoRenderer` renders the video and a fallback. You can use it for both rendering the local video and remote video.
+The livestream layout is built using standard Compose. The [VideoRenderer](../04-ui-components/02-video-renderer.mdx) component is provided by Stream.
+VideoRenderer renders the video and a fallback. You can use it for rendering the local and remote video.
 
 ### Step 4 - (Optional) Publishing RTMP using OBS
 
 The example above showed how to publish your phone's camera to the livestream.
 Almost all livestream software and hardware supports RTMPs.
 So let's see how to publish using RTMPs. Feel free to skip this step if you don't need to use RTMPs.
 
-A. Console log the URL & Stream Key println(call.state.ingress.rtmp)
-
-B. Open OBS and go to settings -> stream
-- Select "custom" service
-- Server: equal to the server URL from the console log
-- Stream key: equal to the stream key from the console log
-
-Press start streaming. The RTMP stream will now show up in your call.
-Now that we've learned to publish using webrtc or RTMP let's talk about viewing the livestream.
+A. Log the URL & Stream Key
 
-### Step 5 - Viewing a livestream (Webrtc)
+```kotlin
+val rtmp = call.state.ingress.rtmp
+Log.i("Tutorial", "RTMP url and streamingKey: $rtmp")
+```
 
-Watching a livestream is basically a simplified version of the code we wrote in `MainActivity.kt`
+B. Open OBS and go to settings -> stream
 
-* You don't need to request permissions or enable the camera
-*
+- Select "custom" service
+- Server: equal to the server URL from the log
+- Stream key: equal to the stream key from the log
 
-To change the example above and just watch a livestream:
+Press start streaming in OBS. The RTMP stream will now show up in your call just like a regular video participant.
+Now that we've learned to publish using WebRTC or RTMP let's talk about watching the livestream.
 
-```kotlin
-// remove this
-call.camera.enable()
-call.microphone.enable()
+### Step 5 - Viewing a livestream (WebRTC)
 
-// and this
-LaunchCallPermissions(call = call)
-// on the UI side remove the button to go live
+Watching a livestream is even easier than broadcasting.
 
-// and update this to use the remote video
-val me by call.state.me.collectAsState()
-val video = me?.video?.collectAsState()
-```
+Compared to the current code in `MainActivity.kt` you:
 
-Here's the update MainActivity for viewing a call
+* Don't need to request permissions or enable the camera
+* Don't render the local video, but instead render the remote video
+* Typically include some small UI elements like viewer count, a button to mute etc
 
-```kotlin
-```
+The [docs on building a UI for watching a livestream](../05-ui-cookbook/15-watching-livestream.mdx) explain this in more detail.
 
 ### Step 6 - (Optional) Viewing a livestream with HLS
 
-Another way to view a livestream is using HLS. HLS tends to have a 10 to 20 seconds delay, while the above webrtc approach only has a 100-200ms delay typically.
-The benefit that HLS has is that it buffers better under poor network conditions.
-So for apps where you expect your users to have poor network, and where a 10 second delay is ok, HLS can be a better option.
+Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 seconds delay, while the above WebRTC approach is realtime.
+The benefit that HLS offers is better buffering under poor network conditions.
+So HLS can be a good option when:
 
-Let's show how to broadcast your call to HLS.
+* A 10-20 second delay is acceptable
+* Your users want to watch the Stream in poor network conditions
 
-```kotlin
-call.startBroadcast()
-```
-
-After starting the broadcast the HLS url can be found in the call state
+Let's show how to broadcast your call to HLS:
 
 ```kotlin
-call.state.egress.value?.hls
+call.startBroadcast()
+val hlsUrl = call.state.egress.value?.hls
+Log.i("Tutorial", "HLS url = $hlsUrl")
 ```
 
-You can view the HLS video feed using any open source HLS capable video player.
+You can view the HLS video feed using any HLS capable video player.
 
 ### 7 - Advanced Features
 
+This tutorial covered broadcasting and watching a livestream.
+It also went into more details about HLS & RTMP-in.
+
 There are several advanced features that can improve the livestreaming experience:
 
-* ** Co-hosts ** You can add members to your livestream with elevated permissions. So you can have co-hosts, moderators etc.
-* ** Custom events ** You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case.
-* ** Reactions & Chat ** Users can react to the livestream, and you can add chat. This makes for a more engaging experience.
-* ** Notifications ** You can notify users via push notifications when the livestream starts
-* ** Recording ** The call recording functionality allows you to record the call with various options and layouts
+* ** [Co-hosts](../03-guides/02-joining-creating-calls.mdx) ** You can add members to your livestream with elevated permissions. So you can have co-hosts, moderators etc.
+* ** [Custom events](../03-guides/08-reactions-and-custom-events.mdx) ** You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case.
+* ** [Reactions & Chat](../03-guides/08-reactions-and-custom-events.mdx) ** Users can react to the livestream, and you can add chat. This makes for a more engaging experience.
+* ** [Notifications](../06-advanced/01-ringing.mdx) ** You can notify users via push notifications when the livestream starts
+* ** [Recording](../06-advanced/06-recording.mdx) ** The call recording functionality allows you to record the call with various options and layouts
 
 ### Recap
 
@@ -344,7 +346,7 @@ Our team is also happy to review your UI designs and offer recommendations on ho
 
 To recap what we've learned:
 
-* Webrtc is optimal for latency, HLS is slower but buffers better for users with poor connections
+* WebRTC is optimal for latency, HLS is slower but buffers better for users with poor connections
 * You setup a call: (val call = client.call("livestream", callId))
 * The call type "livestream" controls which features are enabled and how permissions are setup
 * The livestream by default enables "backstage" mode. This allows you and your co-hosts to setup your mic and camera before allowing people in
````

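A side note on the OBS step in the livestream diff above: OBS asks for the server URL and stream key as separate fields, while the logged `rtmp` value may arrive as one combined address. A small, hypothetical helper could separate them; the rule that the stream key is the final path segment is an assumption about the address shape, not an SDK guarantee:

```kotlin
// Hypothetical helper, not part of the Stream SDK: split a combined RTMP
// address into the server URL and stream key that OBS asks for separately.
// Assumes the stream key is the final path segment of the address.
fun splitRtmpAddress(address: String): Pair<String, String> {
    val cut = address.lastIndexOf('/')
    require(cut > "rtmps://".length) { "unexpected RTMP address: $address" }
    return address.substring(0, cut) to address.substring(cut + 1)
}
```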
docusaurus/docs/Android/05-ui-cookbook/01-overview.mdx

Lines changed: 3 additions & 0 deletions
````diff
@@ -77,6 +77,9 @@ This cookbook aims to show you how to build your own UI elements for video calling
 <CookbookCard title="Reactions">
 <img src={require("../assets/cookbook/reactions.png").default} />
 </CookbookCard>
+<CookbookCard title="Watching a Livestream">
+<img src={require("../assets/cookbook/reactions.png").default} />
+</CookbookCard>
 </CookbookCardGrid>
 </div>
 
````

0 commit comments