## README.md (+19 −9)
```diff
@@ -104,30 +104,40 @@ Video roadmap and changelog is available [here](https://github.com/GetStream/pro
 - [ ] Complete integration with the video demo flow
 - [ ] Finish usability testing with design team on chat integration (Jaewoong)
-- [ ] Ringing: Finish it, make testing easy and write docs for common changes (Daniel)
-- [ ] Enable ice restarts for publisher and subscriber
-- [ ] Livestream tutorial (depends on RTMP support) (Thierry)
+- [X] Ringing: Finish it, make testing easy and write docs for common changes (Daniel)
 - [ ] Bug: Screensharing on Firefox has some issues when rendering on android (Daniel)
+- [ ] Pagination on query members & query call endpoints (Daniel)
+- [ ] local version of audioLevel(s) for lower latency audio visualizations (Daniel)
+- [ ] Android SDK development.md cleanup (Daniel)
+- [ ] Livestream tutorial (depends on RTMP support) (Thierry)
 - [ ] Call Analytics stateflow (Thierry)
-- [ ] Pagination on query members & query channel endpoints (Daniel)
+- [ ] Enable ice restarts for publisher and subscriber
 - [ ] Test coverage
 - [ ] Testing on more devices
-- [ ] local version of audioLevel(s) for lower latency audio visualizations (Daniel)
-- [ ] Android SDK development.md cleanup (Daniel)
 - [ ] Logging is too verbose (rtc is very noisy), clean it up to focus on the essential for info and higher
 
 ### 0.4.0 milestone
 
 - [ ] Upgrade to more recent versions of webrtc
-- [ ] Screensharing from mobile
 - [ ] Tap to focus
-- [ ] Camera controls
 - [ ] Picture of the video stream at highest resolution
 - [ ] Review foreground service vs backend for some things like screensharing etc
 - [ ] Audio & Video filters. Support (Daniel)
 - [ ] H264 workaround on Samsung 23 (see https://github.com/livekit/client-sdk-android/blob/main/livekit-android-sdk/src/main/java/io/livekit/android/webrtc/SimulcastVideoEncoderFactoryWrapper.kt#L34 and
```
## docusaurus/docs/Android/02-tutorials/01-video-calling.mdx (+6 −4)
```diff
@@ -18,15 +18,17 @@ This tutorial teaches you how to build Zoom/Whatsapp style video calling for you
 2. Select Phone & Tablet -> **Empty Activity**
 3. Name your project **VideoCall**.
 
-Note that setup steps can vary slightly across Android Studio versions.
-If you run into trouble, make sure to use the latest version of Android Studio ([Flamingo](https://developer.android.com/studio/releases#android-studio-flamingo-|-2022.2.1-patch-2-may-2023) or higher).
+Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
+We recommend using Android Studio Giraffe or newer.
 
 ### Step 2 - Install the SDK & Setup the client
 
-**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle`.
+**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
 
 If you're new to android, note that there are 2 `build.gradle` files, you want to open the `build.gradle` in the app folder.
```
## docusaurus/docs/Android/02-tutorials/02-audio-room.mdx (+3 −4)
```diff
@@ -20,17 +20,16 @@ Time to get started building an audio-room for your app.
 
 ### Step 1 - Create a new project in Android Studio
 
-This tutorial was written using [Android Studio Flamingo](https://developer.android.com/studio/releases#android-studio-flamingo-|-2022.2.1-patch-2-may-2023).
-Setup steps can vary slightly across Android Studio versions.
-If you run into trouble, make sure to use the latest version of Android Studio.
+Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
+We recommend using Android Studio Giraffe or newer.
 
 1. Create a new project
 2. Select Phone & Tablet -> **Empty Activity**
 3. Name your project **AudioRoom**.
 
 ### Step 2 - Install the SDK & Setup the client
 
-**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle`.
+**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
 
 If you're new to android, note that there are 2 `build.gradle` files, you want to open the `build.gradle` in the app folder.
```
## docusaurus/docs/Android/02-tutorials (livestream tutorial)

````diff
 When you run the app now you'll see a text message saying: "TODO: render video".
+Before we get around to rendering the video let's review the code above.
+
+In the first step we setup the user:
 
 ```kotlin
 val user = User(
@@ -119,15 +117,15 @@ val user = User(
 )
 ```
 
-If you don't have an authenticated user you can also use a guest or anonymous user. TODO DOCS
+If you don't have an authenticated user you can also use a guest or anonymous user.
 For most apps it's convenient to match your own system of users to grant and remove permissions.
 
 Next we create the client:
 
 ```kotlin
 val client = StreamVideoBuilder(
   context = applicationContext,
-  apiKey = "hd8szvscpxvd", // demo API key
+  apiKey = "mmhfdzb5evj2", // demo API key
   geo = GEO.GlobalEdgeNetwork,
   user = user,
   token = userToken,
@@ -136,7 +134,9 @@ val client = StreamVideoBuilder(
 
 You'll see the `userToken` variable. Your backend typically generates the user token on signup or login.
````
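The `userToken` is a JWT that your backend signs with your API secret, usually via a server-side SDK. The sketch below only illustrates the JWT *shape*; `unsignedDevToken` is a made-up helper, and a real Stream token must be signed (never `alg: none`).

```kotlin
import java.util.Base64

// Illustrative only: real tokens are HS256-signed on your backend with your
// API secret. This just shows the three-part JWT layout:
// base64url(header).base64url(payload).signature
fun unsignedDevToken(userId: String): String {
    val enc = Base64.getUrlEncoder().withoutPadding()
    val header = enc.encodeToString("""{"alg":"none","typ":"JWT"}""".toByteArray())
    val payload = enc.encodeToString("""{"user_id":"$userId"}""".toByteArray())
    return "$header.$payload." // empty signature segment: never use in production
}

fun main() {
    println(unsignedDevToken("tutorial-user"))
}
```

Decoding the middle segment of a token you received is a quick way to check which `user_id` it was issued for.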
````diff
 
-In the next step we create and join the call. The call object is used for video calls, audio rooms and livestreaming.
+The most important step to review is how we create the call.
+Stream uses the same call object for livestreaming, audio rooms and video calling.
+Have a look at the code snippet below:
 
 ```kotlin
 val call = client.call("livestream", callId)
@@ -149,13 +149,20 @@ lifecycleScope.launch {
 }
 ```
 
-`call.join(create = true)` is the simplest example.
-It's also possible to configure settings on the call or add co-hosts. TODO: Docs
+First, the call object is created by specifying the call type ("livestream") and the callId.
+The "livestream" call type is just a set of defaults that typically works well for a livestream.
+You can edit the features, permissions and settings in the dashboard.
+The dashboard also allows you to create new call types as needed.
+
+Lastly, `call.join(create = true)` creates the call object on our servers and joins it.
+The moment you use `call.join()` the realtime transport for audio and video is started.
````
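A call is addressed by its type plus an id, and Stream commonly writes the combined identifier ("cid") as `"type:id"`. A toy illustration of that pairing (not the SDK's `Call` class; the `cid` format here is our assumption for illustration):

```kotlin
// Toy model of a call identifier; the real SDK's Call class carries far more state.
// Assumption: the combined id ("cid") is formatted as "type:id".
data class CallIdentifier(val type: String, val id: String) {
    val cid: String get() = "$type:$id"
}

fun main() {
    println(CallIdentifier("livestream", "my-first-livestream").cid) // prints "livestream:my-first-livestream"
}
```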
````diff
 
-### Step 4 - Render the video
+Note that you can also add members to a call and assign them different roles. (See the [call creation docs](../03-guides/02-joining-creating-calls.mdx))
+
+### Step 4 - Rendering the video
 
 In this step we're going to build a UI for showing your local video with a button to start the livestream.
-This example uses compose, but you could also use our XML VideoRenderer.
+This example uses Compose, but you could also use our XML VideoRenderer.
 
 In `MainActivity.kt` replace the `VideoTheme` with the following code.
 
@@ -244,12 +251,15 @@ If you now run your app you should see an interface like this:
 
 
 
-When you press **go live** your video will be transmitted.
-Press go live in the android app and click the link below to watch it in your browser.
+Stream uses a technology called SFU cascading to replicate your livestream over different SFUs around the world.
+This makes it possible to reach a large audience in realtime.
+
+Now let's press "go live" in the android app and click the link below to watch the video in your browser.
````
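Pressing "go live" moves the call out of its default "backstage" state (the recap at the end of the tutorial mentions backstage mode). Here is a toy state model of that behaviour, purely for illustration; the SDK drives this through server-backed call state, not a local boolean:

```kotlin
// Toy backstage/live toggle; the real SDK updates this via API calls and
// exposes it through call state, not a local field.
class BackstageModel {
    var isBackstage: Boolean = true // livestreams start in backstage by default
        private set
    val isLive: Boolean get() = !isBackstage

    fun goLive() { isBackstage = false }
    fun stopLive() { isBackstage = true }
}

fun main() {
    val call = BackstageModel()
    call.goLive()
    println(call.isLive) // prints "true"
}
```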
````diff
-Let's take a moment to review the compose code above `Call.state` exposes all the stateflow objects you need.
+Let's take a moment to review the Compose code above. `Call.state` exposes all the stateflow objects you need.
+
 The most important ones are:
 
 ```
@@ -259,82 +269,74 @@ call.state.participants
 
 The [participant state docs](../03-guides/03-call-and-participant-state.mdx) show all the available data.
````
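`call.state.participants` is a list you can transform with ordinary Kotlin collection operations. As an illustration (using a toy participant type, not Stream's actual participant state), one plausible ordering for a livestream grid:

```kotlin
// Toy participant type for illustration; the SDK's participant state differs.
data class TutorialParticipant(
    val name: String,
    val audioLevel: Float,
    val isDominantSpeaker: Boolean,
)

// One plausible ordering: dominant speaker first, then by audio level.
fun sortForGrid(participants: List<TutorialParticipant>): List<TutorialParticipant> =
    participants.sortedWith(
        compareByDescending<TutorialParticipant> { it.isDominantSpeaker }
            .thenByDescending { it.audioLevel }
    )

fun main() {
    val sorted = sortForGrid(
        listOf(
            TutorialParticipant("ann", 0.1f, false),
            TutorialParticipant("ben", 0.9f, false),
            TutorialParticipant("host", 0.5f, true),
        )
    )
    println(sorted.map { it.name }) // prints "[host, ben, ann]"
}
```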
````diff
 
-The compose layout is vanilla compose other than **[VideoRenderer](../04-ui-components/02-video-renderer.mdx)**.
-`VideoRenderer` renders the video and a fallback. You can use it for both rendering the local video and remote video.
+The livestream layout is built using standard Compose. The [VideoRenderer](../04-ui-components/02-video-renderer.mdx) component is provided by Stream.
+`VideoRenderer` renders the video and a fallback. You can use it for rendering the local and remote video.
 
 ### Step 4 - (Optional) Publishing RTMP using OBS
 
 The example above showed how to publish your phone's camera to the livestream.
 Almost all livestream software and hardware supports RTMPs.
 So let's see how to publish using RTMPs. Feel free to skip this step if you don't need to use RTMPs.
 
-A. Console log the URL & Stream Key println(call.state.ingress.rtmp)
-
-B. Open OBS and go to settings -> stream
-- Select "custom" service
-- Server: equal to the server URL from the console log
-- Stream key: equal to the stream key from the console log
-
-Press start streaming. The RTMP stream will now show up in your call.
-Now that we've learned to publish using webrtc or RTMP let's talk about viewing the livestream.
-
-### Step 5 - Viewing a livestream (Webrtc)
+A. Log the URL & Stream Key
 
-Watching a livestream is basically a simplified version of the code we wrote in `MainActivity.kt`
+```kotlin
+val rtmp = call.state.ingress.rtmp
+Log.i("Tutorial", "RTMP url and streamingKey: $rtmp")
+```
 
-* You don't need to request permissions or enable the camera
-*
+B. Open OBS and go to settings -> stream
 
-To change the example above and just watch a livestream:
+- Select "custom" service
+- Server: equal to the server URL from the log
+- Stream key: equal to the stream key from the log
+
+Press start streaming in OBS. The RTMP stream will now show up in your call just like a regular video participant.
+Now that we've learned to publish using WebRTC or RTMP let's talk about watching the livestream.
````
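OBS wants the server URL and the stream key in two separate fields, while the logged ingress value is often a single RTMP URL. Assuming the key is simply the last path segment (an assumption about the URL shape, not documented behaviour; verify against the value you actually logged), a split could look like:

```kotlin
// Assumption: the stream key is the last path segment of the ingress URL.
fun splitRtmpAddress(full: String): Pair<String, String> {
    val cut = full.lastIndexOf('/')
    require(cut > "rtmps://".length) { "unexpected RTMP url: $full" }
    return full.substring(0, cut) to full.substring(cut + 1)
}

fun main() {
    val (server, key) = splitRtmpAddress("rtmps://ingress.example.com/live/key-123")
    println(server) // prints "rtmps://ingress.example.com/live"
    println(key)    // prints "key-123"
}
```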
````diff
 
-```kotlin
-// remove this
-call.camera.enable()
-call.microphone.enable()
+### Step 5 - Viewing a livestream (WebRTC)
 
-// and this
-LaunchCallPermissions(call = call)
-// on the UI side remove the button to go live
+Watching a livestream is even easier than broadcasting.
 
-// and update this to use the remote video
-val me by call.state.me.collectAsState()
-val video = me?.video?.collectAsState()
-```
+Compared to the current code in `MainActivity.kt` you:
 
-Here's the update MainActivity for viewing a call
+* Don't need to request permissions or enable the camera
+* Don't render the local video, but instead render the remote video
+* Typically include some small UI elements like viewer count, a button to mute etc.
 
-```kotlin
-```
+The [docs on building a UI for watching a livestream](../05-ui-cookbook/15-watching-livestream.mdx) explain this in more detail.
 
 ### Step 6 - (Optional) Viewing a livestream with HLS
 
-Another way to view a livestream is using HLS. HLS tends to have a 10 to 20 seconds delay, while the above webrtc approach only has a 100-200ms delay typically.
-The benefit that HLS has is that it buffers better under poor network conditions.
-So for apps where you expect your users to have poor network, and where a 10 second delay is ok, HLS can be a better option.
+Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 second delay, while the above WebRTC approach is realtime.
+The benefit that HLS offers is better buffering under poor network conditions.
+So HLS can be a good option when:
 
-Let's show how to broadcast your call to HLS.
+* A 10-20 second delay is acceptable
+* Your users want to watch the stream in poor network conditions
````
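The trade-off above can be captured in a tiny, admittedly simplistic helper; this is an illustration of the decision, not an SDK API:

```kotlin
// Simplistic illustration of the WebRTC vs HLS trade-off described above.
fun chooseViewingProtocol(delayToleranceSeconds: Int, expectPoorNetwork: Boolean): String =
    if (expectPoorNetwork && delayToleranceSeconds >= 10) "HLS" else "WebRTC"

fun main() {
    println(chooseViewingProtocol(delayToleranceSeconds = 30, expectPoorNetwork = true))  // prints "HLS"
    println(chooseViewingProtocol(delayToleranceSeconds = 1, expectPoorNetwork = false)) // prints "WebRTC"
}
```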
````diff
 
-```kotlin
-call.startBroadcast()
-```
-
-After starting the broadcast the HLS url can be found in the call state
+Let's show how to broadcast your call to HLS:
 
 ```kotlin
-call.state.egress.value?.hls
+call.startBroadcast()
+val hlsUrl = call.state.egress.value?.hls
+Log.i("Tutorial", "HLS url = $hlsUrl")
 ```
 
-You can view the HLS video feed using any open source HLS capable video player.
+You can view the HLS video feed using any HLS-capable video player.
 
 ### 7 - Advanced Features
 
+This tutorial covered broadcasting and watching a livestream.
+It also went into more detail about HLS & RTMP-in.
+
 There are several advanced features that can improve the livestreaming experience:
 
-* **Co-hosts** You can add members to your livestream with elevated permissions. So you can have co-hosts, moderators etc.
-* **Custom events** You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case.
-* **Reactions & Chat** Users can react to the livestream, and you can add chat. This makes for a more engaging experience.
-* **Notifications** You can notify users via push notifications when the livestream starts
-* **Recording** The call recording functionality allows you to record the call with various options and layouts
+* **[Co-hosts](../03-guides/02-joining-creating-calls.mdx)** You can add members to your livestream with elevated permissions. So you can have co-hosts, moderators etc.
+* **[Custom events](../03-guides/08-reactions-and-custom-events.mdx)** You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case.
+* **[Reactions & Chat](../03-guides/08-reactions-and-custom-events.mdx)** Users can react to the livestream, and you can add chat. This makes for a more engaging experience.
+* **[Notifications](../06-advanced/01-ringing.mdx)** You can notify users via push notifications when the livestream starts
+* **[Recording](../06-advanced/06-recording.mdx)** The call recording functionality allows you to record the call with various options and layouts
 
 ### Recap
 
@@ -344,7 +346,7 @@ Our team is also happy to review your UI designs and offer recommendations on ho
 
 To recap what we've learned:
 
-* Webrtc is optimal for latency, HLS is slower but buffers better for users with poor connections
+* WebRTC is optimal for latency, HLS is slower but buffers better for users with poor connections
 * You setup a call: (val call = client.call("livestream", callId))
 * The call type "livestream" controls which features are enabled and how permissions are setup
 * The livestream by default enables "backstage" mode. This allows you and your co-hosts to setup your mic and camera before allowing people in
````
0 commit comments