docusaurus/docs/Android/02-tutorials/03-livestream.mdx
Let's get started. If you have any questions or feedback, be sure to let us know.
### Step 1 - Create a new project in Android Studio
Note that this tutorial was written using **Android Studio Giraffe**. Setup steps can vary slightly across Android Studio versions. We recommend using [Android Studio Giraffe or newer](https://developer.android.com/studio/releases).
1. Create a new project
2. Select Phone & Tablet -> **Empty Activity**.
3. Name your project **Livestream**.
### Step 2 - Install the SDK & Setup the client
**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`. If you're new to Android, note that there are two `build.gradle` files; you want to open the one in the app folder.

There are two versions of Stream's SDK:

- **Video Compose SDK**: the `io.getstream:stream-video-android-compose` dependency, which includes the video core SDK plus Compose UI components.
- **Video Core SDK**: `io.getstream:stream-video-android-core`, which only includes the core parts of the video SDK.

This tutorial demonstrates the Compose Video SDK, but you have the option to use the core library without Compose based on your preference.
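For reference, the dependency declaration might look like the following sketch. The version placeholder is an assumption; check Stream's releases for the actual latest version:

```kotlin
// app/build.gradle.kts (illustrative sketch; replace the version placeholder)
dependencies {
    // Compose UI components plus the video core SDK
    implementation("io.getstream:stream-video-android-compose:<latest-version>")
}
```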
### Step 3 - Broadcast a livestream from your phone
```kotlin
lifecycleScope.launch {
    // ...
}
```
To create the first call object, specify the call type as **livestream** and provide a unique **callId**. The **livestream** call type comes with default settings that are usually suitable for livestreams, but you can customize features, permissions, and settings in the dashboard. Additionally, the dashboard allows you to create new call types as required.

Finally, using `call.join(create = true)` will not only create the call object on our servers but also initiate the real-time transport for audio and video. This allows for seamless and immediate engagement in the livestream.

Note that you can also add members to a call and assign them different roles. For more information, see the [call creation docs](../03-guides/02-joining-creating-calls.mdx).
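Putting the pieces from this step together, the client and call setup can be sketched as follows. This is a hedged sketch rather than the tutorial's exact code: the API key, user, token, and callId values are placeholders you would supply yourself.

```kotlin
// Sketch of the setup described above; all credential values are placeholders.
val client = StreamVideoBuilder(
    context = applicationContext,
    apiKey = "REPLACE_WITH_API_KEY",
    user = user,       // the User object you created earlier
    token = userToken, // a valid token for that user
).build()

// "livestream" selects the call type; the callId must uniquely identify this stream.
val call = client.call(type = "livestream", id = callId)

lifecycleScope.launch {
    // create = true creates the call on Stream's servers and joins it,
    // which starts the realtime transport for audio and video.
    call.join(create = true)
}
```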
### Step 4 - Rendering the video
In this step we're going to build a UI for showing your local video with a button to start the livestream.
This example uses Compose, but you could also use our XML VideoRenderer.
In `MainActivity.kt` replace the `VideoTheme` with the following code:
```kotlin
VideoTheme {
    // ...
            .background(VideoTheme.colors.appBackground)
            .padding(6.dp),
        contentColor = VideoTheme.colors.appBackground,
        topBar = {
            if (connection == RealtimeConnection.Connected) {
                if (!backstage) {
                    // ...
}
```
Upon running your app, you will be greeted with an interface that looks like this:

Stream uses a technology called SFU cascading to replicate your livestream over different SFUs around the world. This makes it possible to reach a large audience in realtime.

Now let's press **Go live** in the Android app and click the link below to watch the video in your browser.
Let's take a moment to review the Compose code above. `Call.state` exposes all the StateFlow objects you need.
The most important ones are:
```kotlin
// ...
call.state.participants
```
The [participant state docs](../03-guides/03-call-and-participant-state.mdx) show all the available data.
The livestream layout is built using standard Jetpack Compose. The [VideoRenderer](../04-ui-components/02-video-renderer.mdx) component is provided by Stream. **VideoRenderer** renders the video and a fallback. You can use it for rendering the local and remote video.
If you want to learn more about building advanced UIs for watching a livestream, check out [Cookbook: Watching a livestream](../05-ui-cookbook/15-watching-livestream.mdx).
### Step 5 - (Optional) Publishing RTMP using OBS
Compared to the current code in `MainActivity.kt`, you:
* Don't render the local video, but instead render the remote video
* Typically include some small UI elements like a viewer count, a mute button, etc.
### Step 6 - (Optional) Viewing a livestream with HLS
Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 second delay, while the WebRTC approach above is realtime.
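As an aside, and not part of the tutorial's own code: on Android, an HLS playlist URL is typically played with a player such as Media3 ExoPlayer. A minimal sketch, assuming a placeholder playlist URL:

```kotlin
// Hypothetical sketch using androidx.media3 ExoPlayer; the URL is a placeholder,
// not a real Stream endpoint.
val player = ExoPlayer.Builder(context).build()
player.setMediaItem(MediaItem.fromUri("https://example.com/livestream/playlist.m3u8"))
player.prepare()
player.play() // playback begins once the HLS manifest and segments are loaded
```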
docusaurus/docs/Android/05-ui-cookbook/15-watching-livestream.mdx
---
title: Watching a livestream
description: How to watch a livestream on Android with Kotlin
---
This cookbook tutorial walks you through how to build advanced UIs for watching a livestream on Android.
:::note
In this cookbook tutorial, we will assume that you already know how to join a livestream call. If you haven't familiarized yourself with the [Livestream Tutorial](../02-tutorials/03-livestream.mdx) yet, we highly recommend doing so before proceeding with this cookbook.
:::
When you build a livestreaming UI, there are a few things to keep in mind:
In this cookbook tutorial, you'll learn how to build the result below at the end.
### Rendering Livestreaming
First and foremost, rendering the livestreaming video is the key feature and the most crucial part of the screen.

To accomplish this, you can easily render your livestreaming video using the following simple sample code:
```kotlin
val userToken = "REPLACE_WITH_TOKEN"
// ...
```

If you run the above example, you'll see the very basic video streaming screen below:
### Implement Live Participants Label
Now you need to build labels that display the count of participants in your livestreaming session and indicate the streaming time.

You can easily implement the live label using the following approach:
```kotlin
@Composable
fun LiveLabel(
    // ...
}
```
Upon building a preview for the `LiveLabel` Composable, you will observe the following result:
Next, you need to implement the live time label, which will display the duration of the livestream once it starts.
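As a supporting sketch, the duration text such a label displays can be produced by a small plain-Kotlin helper. The function name and the MM:SS format here are illustrative assumptions, not part of the Stream SDK:

```kotlin
// Illustrative helper (not from the SDK): format elapsed seconds as "MM:SS".
fun formatElapsed(totalSeconds: Long): String {
    require(totalSeconds >= 0) { "duration must be non-negative" }
    val minutes = totalSeconds / 60
    val seconds = totalSeconds % 60
    return "%02d:%02d".format(minutes, seconds)
}

fun main() {
    println(formatElapsed(75L))   // 01:15
    println(formatElapsed(3599L)) // 59:59
}
```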
You can simply implement the live time label like so:
```kotlin
VideoTheme {
    // ...
}
```
As demonstrated in the example above, you can observe several state declarations representing the call state:
- `participantCount`: A model that contains information about participant counts.
- `connection`: Indicates the connection state of a call.
### Implement Live Button
Let's proceed with building a live button that enables you to start/stop broadcasting your call and control your physical device, including the camera and microphone.
You can implement the live button like so:
```kotlin
Scaffold(
    // ...
}
```
Once you've completed building your project, you'll witness the final result as depicted below:
By simply clicking the **Go Live** button, you can begin broadcasting your stream.
In this cookbook tutorial, you have learned how to create an advanced live streaming screen. If you wish to refer to the code, feel free to explore the [GitHub Repository](https://github.com/GetStream/stream-video-android/tree/develop/tutorials/tutorial-livestream).
stream-video-android-compose/src/main/kotlin/io/getstream/video/android/compose/ui/components/audio/AudioRoomContent.kt