
Commit da0219b

Merge branch 'develop'

2 parents: fc708bf + 50f90bd

File tree

7 files changed: +53 -43 lines

buildSrc/src/main/kotlin/io/getstream/video/android/Configuration.kt

Lines changed: 2 additions & 2 deletions
@@ -6,9 +6,9 @@ object Configuration {
     const val minSdk = 24
     const val majorVersion = 0
     const val minorVersion = 2
-    const val patchVersion = 0
+    const val patchVersion = 1
     const val versionName = "$majorVersion.$minorVersion.$patchVersion"
-    const val versionCode = 1
+    const val versionCode = 3
     const val snapshotVersionName = "$majorVersion.$minorVersion.${patchVersion}-SNAPSHOT"
     const val artifactGroup = "io.getstream"
 }
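As a quick sanity check of the bump above, here is how the string templates in `Configuration.kt` resolve after this commit (the `main` wrapper is just for illustration; the constants are copied from the diff):

```kotlin
// Constants copied from Configuration.kt after this commit.
const val majorVersion = 0
const val minorVersion = 2
const val patchVersion = 1

fun main() {
    val versionName = "$majorVersion.$minorVersion.$patchVersion"
    val snapshotVersionName = "$majorVersion.$minorVersion.${patchVersion}-SNAPSHOT"
    println(versionName)         // 0.2.1
    println(snapshotVersionName) // 0.2.1-SNAPSHOT
}
```

This is why every documentation snippet below moves to `io.getstream:stream-video-android-compose:0.2.1`.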

docusaurus/docs/Android/02-tutorials/01-video-calling.mdx

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ If you're new to android, note that there are 2 `build.gradle` files, you want t
 ```kotlin
 dependencies {
     // Stream Video Compose SDK
-    implementation("io.getstream:stream-video-android-compose:0.2.0")
+    implementation("io.getstream:stream-video-android-compose:0.2.1")
 
     // Optionally add Jetpack Compose if Android studio didn't automatically include them
     implementation(platform("androidx.compose:compose-bom:2023.06.00"))

docusaurus/docs/Android/02-tutorials/02-audio-room.mdx

Lines changed: 1 addition & 1 deletion
@@ -35,7 +35,7 @@ If you're new to android, note that there are 2 `build.gradle` files, you want t
 ```groovy
 dependencies {
     // Stream Video Compose SDK
-    implementation("io.getstream:stream-video-android-compose:0.2.0")
+    implementation("io.getstream:stream-video-android-compose:0.2.1")
 
     // Jetpack Compose (optional/ android studio typically adds them when you create a new project)
     implementation(platform("androidx.compose:compose-bom:2023.06.00"))

docusaurus/docs/Android/02-tutorials/03-livestream.mdx

Lines changed: 34 additions & 25 deletions
@@ -20,27 +20,40 @@ Let's get started, if you have any questions or feedback be sure to let us know
 
 ### Step 1 - Create a new project in Android Studio
 
-Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
-We recommend using Android Studio Giraffe or newer.
+Note that this tutorial was written using **Android Studio Giraffe**. Setup steps can vary slightly across Android Studio versions.
+We recommend using [Android Studio Giraffe or newer](https://developer.android.com/studio/releases).
 
 1. Create a new project
-2. Select Phone & Template -> **empty activity**
+2. Select Phone & Tablet -> **Empty Activity**
 3. Name your project **Livestream**.
 
 ### Step 2 - Install the SDK & Setup the client
 
-**Add the video SDK** to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
-If you're new to android note that there are 2 build.gradle files, you want to open the one in the app folder.
+**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
+If you're new to android, note that there are 2 `build.gradle` files, you want to open the `build.gradle` in the app folder.
 
 ```kotlin
 dependencies {
-    implementation("io.getstream:stream-video-android-compose:0.2.0")
-
-    ...
+    // Stream Video Compose SDK
+    implementation("io.getstream:stream-video-android-compose:0.2.1")
+
+    // Jetpack Compose (optional/ android studio typically adds them when you create a new project)
+    implementation(platform("androidx.compose:compose-bom:2023.06.00"))
+    implementation("androidx.activity:activity-compose:1.7.2")
+    implementation("androidx.compose.ui:ui")
+    implementation("androidx.compose.ui:ui-tooling")
+    implementation("androidx.compose.runtime:runtime")
+    implementation("androidx.compose.foundation:foundation")
+    implementation("com.google.android.material:material")
 }
 ```
 
-This tutorial uses the compose version of the video SDK. Stream also provides a core library without compose.
+There are 2 versions of Stream's SDK.
+
+- **Video Compose SDK**: `io.getstream:stream-video-android-compose` dependency that includes the video core SDK + compose UI components.
+- **Video Core SDK**: `io.getstream:stream-video-android-core` that only includes the core parts of the video SDK.
+
+This tutorial demonstrates the Compose Video SDK, but you have the option to use the core library without Compose based on your preference.
 
 ### Step 3 - Broadcast a livestream from your phone
 
@@ -149,22 +162,18 @@ lifecycleScope.launch {
 }
 ```
 
-First call object is created by specifying the call type: "livestream" and the callId.
-The "livestream" call type is just a set a defaults that typically works well for a livestream.
-You can edit the features, permissions and settings in the dashboard.
-The dashboard also allows you to create new call types as needed.
+To create the first call object, specify the call type as **livestream** and provide a unique **callId**. The **livestream** call type comes with default settings that are usually suitable for livestreams, but you can customize features, permissions, and settings in the dashboard. Additionally, the dashboard allows you to create new call types as required.
 
-Lastly, call.join(create = true) creates the call object on our servers and joins it.
-The moment you use call.join() the realtime transport for audio and video is started.
+Finally, using `call.join(create = true)` will not only create the call object on our servers but also initiate the real-time transport for audio and video. This allows for seamless and immediate engagement in the livestream.
 
-Note that you can also add members to a call and assign them different roles. (See the [call creation docs](../03-guides/02-joining-creating-calls.mdx))
+Note that you can also add members to a call and assign them different roles. For more information, see the [call creation docs](../03-guides/02-joining-creating-calls.mdx).
 
 ### Step 4 - Rendering the video
 
 In this step we're going to build a UI for showing your local video with a button to start the livestream.
 This example uses Compose, but you could also use our XML VideoRenderer.
 
-In `MainActivity.kt` replace the `VideoTheme` with the following code.
+In `MainActivity.kt` replace the `VideoTheme` with the following code:
 
 ```kotlin
 VideoTheme {
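For context, the create-and-join flow described in the hunk above can be sketched as follows. Only the `"livestream"` call type and `call.join(create = true)` come from the tutorial text itself; the `StreamVideoBuilder` setup and the credential placeholders are illustrative assumptions:

```kotlin
// Sketch of creating and joining a livestream call.
// StreamVideoBuilder arguments and credentials are illustrative assumptions.
val client = StreamVideoBuilder(
    context = applicationContext,
    apiKey = "REPLACE_WITH_API_KEY",
    user = user,
    token = "REPLACE_WITH_TOKEN",
).build()

// "livestream" is a call type: a bundle of defaults, permissions and
// settings that you can edit (or extend with new types) in the dashboard.
val call = client.call(type = "livestream", id = "REPLACE_WITH_CALL_ID")

lifecycleScope.launch {
    // create = true creates the call on Stream's servers and joins it;
    // the realtime audio/video transport starts the moment join() runs.
    call.join(create = true)
}
```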
@@ -183,7 +192,7 @@
         .background(VideoTheme.colors.appBackground)
         .padding(6.dp),
     contentColor = VideoTheme.colors.appBackground,
-    backgroundColor = VideoTheme.colors.appBackground,
+
     topBar = {
         if (connection == RealtimeConnection.Connected) {
             if (!backstage) {
@@ -247,18 +256,18 @@
 }
 ```
 
-If you now run your app you should see an interface like this:
+Upon running your app, you will be greeted with an interface that looks like this:
 
 ![Livestream](../assets/tutorial-livestream.png)
 
 Stream uses a technology called SFU cascading to replicate your livestream over different SFUs around the world.
 This makes it possible to reach a large audience in realtime.
 
-Now let's press "go live" in the android app and click the link below to watch the video in your browser.
+Now let's press **Go live** in the android app and click the link below to watch the video in your browser.
 
 <TokenSnippet sampleApp='livestream' displayStyle='join' />
 
-Let's take a moment to review the Compose code above. Call.state exposes all the stateflow objects you need.
+Let's take a moment to review the Compose code above. `Call.state` exposes all the stateflow objects you need.
 
 The most important ones are:
 
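The stateflows this section refers to (`connection`, `backstage`, `call.state.participants`) are typically collected into Compose state before driving the UI. A minimal sketch; the `collectAsStateWithLifecycle` collection style is an assumption, not part of this diff:

```kotlin
// Sketch: collect call.state stateflows as Compose state.
val connection by call.state.connection.collectAsStateWithLifecycle()
val backstage by call.state.backstage.collectAsStateWithLifecycle()
val participants by call.state.participants.collectAsStateWithLifecycle()

// Example: only offer "Go live" while connected and still in backstage.
if (connection == RealtimeConnection.Connected && backstage) {
    Text(text = "Go live (${participants.size} participants ready)")
}
```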
@@ -269,8 +278,10 @@ call.state.participants
 
 The [participant state docs](../03-guides/03-call-and-participant-state.mdx) show all the available data.
 
-The livestream layout is built using standard Compose. The [VideoRenderer](../04-ui-components/02-video-renderer.mdx) component is provided by Stream.
-VideoRenderer renders the video and a fallback. You can use it for rendering the local and remote video.
+The livestream layout is built using standard Jetpack Compose. The [VideoRenderer](../04-ui-components/02-video-renderer.mdx) component is provided by Stream.
+**VideoRenderer** renders the video and a fallback. You can use it for rendering the local and remote video.
+
+If you want to learn more about building advanced UIs for watching a livestream, check out [Cookbook: Watching a livestream](../05-ui-cookbook/15-watching-livestream.mdx).
 
 ### Step 4 - (Optional) Publishing RTMP using OBS
 
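A hedged sketch of how `VideoRenderer` is typically invoked for either the local or a remote participant; the way the `video` value is obtained here is an assumption for illustration, so consult the linked VideoRenderer docs for the exact signature:

```kotlin
// Sketch: render a participant's video track with Stream's VideoRenderer.
// How `video` is bound from participant state is an assumption, not the
// documented signature from this diff.
VideoRenderer(
    modifier = Modifier.fillMaxSize(),
    call = call,
    video = video, // renders the track, or the fallback when unavailable
)
```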
@@ -304,8 +315,6 @@ Compared to the current code in in `MainActivity.kt` you:
 * Don't render the local video, but instead render the remote video
 * Typically include some small UI elements like viewer count, a button to mute etc
 
-The [docs on building a UI for watching a livestream](../05-ui-cookbook/15-watching-livestream.mdx) explain this in more detail.
-
 ### Step 6 - (Optional) Viewing a livestream with HLS
 
 Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 seconds delay, while the above WebRTC approach is realtime.

docusaurus/docs/Android/05-ui-cookbook/15-watching-livestream.mdx

Lines changed: 14 additions & 12 deletions
@@ -3,10 +3,10 @@ title: Watching a livestream
 description: How to watch a livestream on android with Kotlin
 ---
 
-This cookbook tutorial walks you through how to render the UI for watching a livestream on Android.
+This cookbook tutorial walks you through how to build an advanced UI for watching a livestream on Android.
 
 :::note
-This cookbook tutorial will skip how to join a livestream call. If you didn't learn about the [Livestream Tutorial](../02-tutorials/03-livestream.mdx), we highly recommend you to read about the tutorial first.
+In this cookbook tutorial, we will assume that you already know how to join a livestream call. If you haven't familiarized yourself with the [Livestream Tutorial](../02-tutorials/03-livestream.mdx) yet, we highly recommend doing so before proceeding with this cookbook.
 :::
 
 When you build a livestreaming UI, there are a few things to keep in mind:
@@ -27,9 +27,9 @@ In this cookbook tutorial, you'll learn how to build the result below at the end
 
 ### Rendering Livestreaming
 
-First thing first, you need to render a livestreaming video, which is the most important part in the screen.
+First and foremost, rendering the livestreaming video is the key feature and the most crucial part of the screen.
 
-You can simply render your livestreaming video like the sample below:
+To accomplish this, you can render your livestreaming video using the following sample code:
 
 ```kotlin
 val userToken = "REPLACE_WITH_TOKEN"
@@ -91,9 +91,9 @@ If you run the above example, you'll see the very basic video streaming screen b
 
 ### Implement Live Participants Label
 
-Now you need to build labels that indicates the count of participants of your livestreaming, and streaming time.
+Now you need to build labels that display the count of participants in your livestreaming session and indicate the streaming time.
 
-You can simply implement the live label like so:
+You can easily implement the live label using the following approach:
 
 ```kotlin
 @Composable
@@ -133,13 +133,13 @@ fun LiveLabel(
 }
 ```
 
-If you build a preview for `LiveLabel` Composable, you'll see the result below:
+Upon building a preview for the `LiveLabel` Composable, you will observe the following result:
 
 ![LiveLabel](../assets/cookbook/livestream-live-label.png)
 
 ### Implement Live Time Label
 
-Next, you need to implement the live time label, which indicates the time duration once your start live streaming.
+Next, you need to implement the live time label, which will display the duration of the livestream once it starts.
 
 You can simply implement the live time label like so:
 
@@ -239,7 +239,7 @@
 }
 ```
 
-As you can see above example, you can see some of state declaration from the call state:
+As demonstrated in the example above, you can observe several state declarations representing the call state:
 
 - `participantCount`: A model that contains information about participant counts.
 - `connection`: Indicates the connection state of a call.
@@ -250,7 +250,7 @@ As you can see above example, you can see some of state declaration from the cal
 
 ### Implement Live Button
 
-Now let's build a live button that allows you to start/stop broadcast your call, and controls your physical device, such as camera and microphone.
+Let's proceed with building a live button that enables you to start/stop broadcasting your call and control your physical device, including the camera and microphone.
 
 You can implement the live button like so:
 
@@ -353,8 +353,10 @@ Scaffold(
 }
 ```
 
-After building your project, you'll see the final result below:
+Once you've completed building your project, you'll see the final result as depicted below:
 
 ![LiveStream Backstage](../assets/cookbook/livestream-backstage.png)
 
-You can broadcast your stream by clicking the **Go Live** button.
+By simply clicking the **Go Live** button, you can begin broadcasting your stream.
+
+In this cookbook tutorial, you have learned how to create an advanced live streaming screen. If you wish to refer to the code, feel free to explore the [GitHub Repository](https://github.com/GetStream/stream-video-android/tree/develop/tutorials/tutorial-livestream).
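The **Go Live** toggle discussed in this cookbook ultimately flips the call's backstage state. A minimal sketch, assuming the SDK exposes `goLive()`/`stopLive()` broadcast controls (an assumption, since the relevant code is elided from this diff):

```kotlin
// Sketch of a Go Live / Stop toggle driven by the backstage stateflow.
val backstage by call.state.backstage.collectAsStateWithLifecycle()
val scope = rememberCoroutineScope()

Button(
    onClick = {
        scope.launch {
            // goLive()/stopLive() are assumed SDK calls for broadcast control.
            if (backstage) call.goLive() else call.stopLive()
        }
    },
) {
    Text(text = if (backstage) "Go Live" else "Stop Broadcast")
}
```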

docusaurus/docs/Android/06-advanced/04-chat-with-video.mdx

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ Let the project sync. It should have all the dependencies required for you to fi
 ```groovy
 dependencies {
     // Stream Video Compose SDK
-    implementation("io.getstream:stream-video-android-compose:0.2.0")
+    implementation("io.getstream:stream-video-android-compose:0.2.1")
 
     // Stream Chat
     implementation(libs.stream.chat.compose)

stream-video-android-compose/src/main/kotlin/io/getstream/video/android/compose/ui/components/audio/AudioRoomContent.kt

Lines changed: 0 additions & 1 deletion
@@ -109,7 +109,6 @@ public fun AudioRoomContent(
             .background(VideoTheme.colors.appBackground)
             .padding(32.dp),
         contentColor = VideoTheme.colors.appBackground,
-        backgroundColor = VideoTheme.colors.appBackground,
         topBar = {
             if (isShowingAppBar) {
                 appBarContent.invoke(call)
