
Commit 57808b1

Android Selfie App LP review
1 parent 8147bfb commit 57808b1

File tree

4 files changed, +27 -50 lines

content/learning-paths/smartphones-and-mobile/build-android-selfie-app-using-mediapipe-multimodality/6-flow-data-to-view-1.md

Lines changed: 7 additions & 7 deletions
@@ -8,7 +8,7 @@ layout: learningpathall
 
 [SharedFlow](https://developer.android.com/kotlin/flow/stateflow-and-sharedflow#sharedflow) and [StateFlow](https://developer.android.com/kotlin/flow/stateflow-and-sharedflow#stateflow) are [Kotlin Flow](https://developer.android.com/kotlin/flow) APIs that enable Flows to optimally emit state updates and emit values to multiple consumers.
 
-In this learning path, you will have the opportunity to experiment with both `SharedFlow` and `StateFlow`. This chapter will focus on SharedFlow while the next chapter will focus on StateFlow.
+In this learning path, you will experiment with both `SharedFlow` and `StateFlow`. This section focuses on SharedFlow, while the next section focuses on StateFlow.
 
 `SharedFlow` is a general-purpose, hot flow that can emit values to multiple subscribers. It is highly configurable, allowing you to set the replay cache size, buffer capacity, etc.
 
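As background for the replay-cache behavior this file describes, here is a minimal sketch (assuming kotlinx.coroutines; the flow and value names are illustrative, not taken from the app) showing how a `SharedFlow` with `replay = 1` re-delivers the latest value to a collector that subscribes after the emission:

```kotlin
import kotlinx.coroutines.flow.MutableSharedFlow
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking
import kotlinx.coroutines.yield

// Illustrative sketch, not from the app: a hot SharedFlow with
// replay = 1 re-delivers the most recent value to collectors that
// subscribe after it was emitted.
fun replayDemo(): List<String> = runBlocking {
    val events = MutableSharedFlow<String>(replay = 1)
    events.emit("face-ok") // emitted before any collector exists

    val received = mutableListOf<String>()
    val job = launch {
        events.collect { received += it } // replay cache delivers "face-ok" first
    }
    yield() // let the collector subscribe and drain the replay cache
    events.emit("gesture-ok") // delivered live to the active collector
    yield()
    job.cancel()
    received
}

fun main() {
    println(replayDemo())
}
```

With `replay = 0` the first emission would be lost to the late collector; `replay = 1` is what lets the UI catch up on the most recent event.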
@@ -54,9 +54,9 @@ This `SharedFlow` is initialized with a replay size of `1`. This retains the mos
 
 ## Visualize face and gesture results
 
-To visualize the results of Face Landmark Detection and Gesture Recognition tasks, we have prepared the following code for you based on [MediaPipe's samples](https://github.com/google-ai-edge/mediapipe-samples/tree/main/examples).
+To visualize the results of the Face Landmark Detection and Gesture Recognition tasks, follow the instructions in this section, which are based on [MediaPipe's samples](https://github.com/google-ai-edge/mediapipe-samples/tree/main/examples).
 
-1. Create a new file named `FaceLandmarkerOverlayView.kt` and fill in the content below:
+1. Create a new file named `FaceLandmarkerOverlayView.kt` and copy the content below:
 
 ```kotlin
 /*
@@ -180,7 +180,7 @@ class FaceLandmarkerOverlayView(context: Context?, attrs: AttributeSet?) :
 ```
 
 
-2. Create a new file named `GestureOverlayView.kt` and fill in the content below:
+2. Create a new file named `GestureOverlayView.kt` and copy the content below:
 
 ```kotlin
 /*
@@ -302,7 +302,7 @@ class GestureOverlayView(context: Context?, attrs: AttributeSet?) :
 
 ## Update UI in the view controller
 
-1. Add the above two overlay views to `activity_main.xml` layout file:
+1. Add the two overlay views to the `activity_main.xml` layout file:
 
 ```xml
 <com.example.holisticselfiedemo.FaceLandmarkerOverlayView
@@ -316,7 +316,7 @@ class GestureOverlayView(context: Context?, attrs: AttributeSet?) :
 android:layout_height="match_parent" />
 ```
 
-2. Collect the new SharedFlow `uiEvents` in `MainActivity` by appending the code below to the end of `onCreate` method, **below** `setupCamera()` method call.
+2. Collect the new SharedFlow `uiEvents` in `MainActivity` by appending the code below to the end of the `onCreate` method, below the `setupCamera()` method call.
 
 ```kotlin
 lifecycleScope.launch {
@@ -363,7 +363,7 @@ class GestureOverlayView(context: Context?, attrs: AttributeSet?) :
 }
 ```
 
-4. Build and run the app again. Now you should be seeing face and gesture overlays on top of the camera preview as shown below. Good job!
+4. Build and run the app again. Now you should see face and gesture overlays on top of the camera preview as shown below. Good job!
 
 ![overlay views](images/6/overlay%20views.png)
 
content/learning-paths/smartphones-and-mobile/build-android-selfie-app-using-mediapipe-multimodality/7-flow-data-to-view-2.md

Lines changed: 6 additions & 6 deletions
@@ -25,7 +25,7 @@ Therefore, `StateFlow` is a specialized type of `SharedFlow` that represents a s
 val gestureOk: StateFlow<Boolean> = _gestureOk
 ```
 
-2. Append the following constant values to `MainViewModel`'s companion object. In this demo app, we are only focusing on smiling faces and thumb-up gestures.
+2. Append the following constant values to `MainViewModel`'s companion object. In this demo app, you will focus only on smiling faces and thumb-up gestures.
 
 ```kotlin
 private const val FACE_CATEGORY_MOUTH_SMILE = "mouthSmile"
@@ -75,7 +75,7 @@ Therefore, `StateFlow` is a specialized type of `SharedFlow` that represents a s
 <string name="condition_indicator_text_gesture">Gesture</string>
 ```
 
-2. In the same directory, create a new resource file named `dimens.xml` if not exist, which is used to define layout related dimension values:
+2. In the same directory, create a new resource file named `dimens.xml` if it does not exist. This file defines layout-related dimension values:
 
 ```xml
 <?xml version="1.0" encoding="utf-8"?>
@@ -85,7 +85,7 @@ Therefore, `StateFlow` is a specialized type of `SharedFlow` that represents a s
 </resources>
 ```
 
-3. Navigate to `activity_main.xml` layout file and add the following code to the root `ConstraintLayout`, **below** the two overlay views which you just added in the previous chapter.
+3. Navigate to the `activity_main.xml` layout file and add the following code to the root `ConstraintLayout`, after the two overlay views you added in the previous section.
 
 ```xml
 <androidx.appcompat.widget.SwitchCompat
@@ -111,7 +111,7 @@ Therefore, `StateFlow` is a specialized type of `SharedFlow` that represents a s
 app:layout_constraintBottom_toBottomOf="parent" />
 ```
 
-4. Finally, navigate to `MainActivity.kt` and append the following code inside `repeatOnLifecycle(Lifecycle.State.RESUMED)` block, **below** the `launch` block you just added in the previous chapter. This makes sure each of the **three** parallel `launch` runs in its own Coroutine concurrently without blocking each other.
+4. Finally, navigate to `MainActivity.kt` and append the following code inside the `repeatOnLifecycle(Lifecycle.State.RESUMED)` block, after the `launch` block you added in the previous section. This ensures each of the three parallel `launch` blocks runs in its own coroutine without blocking the others.
 
 ```kotlin
 launch {
@@ -127,15 +127,15 @@ Therefore, `StateFlow` is a specialized type of `SharedFlow` that represents a s
 }
 ```
 
-5. Build and run the app again. Now you should be seeing two switches on the bottom of the screen as shown below, which turns on and off while you smile and show thumb-up gestures. Good job!
+5. Build and run the app again. Now you should see two switches at the bottom of the screen as shown below, which turn on and off as you smile and show thumb-up gestures. Good job!
 
 ![indicator UI](images/7/indicator%20ui.png)
 
 ## Recap on SharedFlow vs StateFlow
 
 This app uses `SharedFlow` for dispatching overlay views' UI events without mandating a specific stateful model, which avoids redundant computation. Meanwhile, it uses `StateFlow` for dispatching condition switches' UI states, which prevents duplicated emission and consequent UI updates.
 
-Here's a breakdown of the differences between `SharedFlow` and `StateFlow`:
+Here's an overview of the differences between `SharedFlow` and `StateFlow`:
 
 | | SharedFlow | StateFlow |
 | --- | --- | --- |

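The recap's claim that `StateFlow` "prevents duplicated emission" can be sketched in isolation (assuming kotlinx.coroutines; names are illustrative, not from the app): assigning an equal value again produces no new emission, while a real change does.

```kotlin
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking
import kotlinx.coroutines.yield

// Illustrative sketch: StateFlow conflates by equality, so setting the
// same value twice emits only once, unlike a plain SharedFlow.
fun stateHistory(): List<Boolean> = runBlocking {
    val state = MutableStateFlow(false)
    val seen = mutableListOf<Boolean>()
    val job = launch { state.collect { seen += it } }
    yield()              // collector receives the current value: false
    state.value = true   // change -> emitted
    yield()
    state.value = true   // duplicate -> skipped (equality-based conflation)
    yield()
    state.value = false  // change -> emitted
    yield()
    job.cancel()
    seen
}

fun main() {
    println(stateHistory())
}
```

This is exactly why the condition switches use `StateFlow`: a stream of identical `true` values would otherwise trigger redundant UI updates.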
content/learning-paths/smartphones-and-mobile/build-android-selfie-app-using-mediapipe-multimodality/8-mediate-flows.md

Lines changed: 8 additions & 31 deletions
@@ -6,7 +6,7 @@ weight: 8
 layout: learningpathall
 ---
 
-Now you have two independent Flows indicating the conditions of face landmark detection and gesture recognition. The simplest multimodality strategy is to combine multiple source Flows into a single output Flow, which emits consolidated values as the [single source of truth](https://en.wikipedia.org/wiki/Single_source_of_truth) for its observers (collectors) to carry out corresponding actions.
+Now you have two independent Flows indicating the conditions of face landmark detection and gesture recognition. The simplest multimodality strategy is to combine multiple source Flows into a single output Flow, which emits consolidated values as the single source of truth for its observers (collectors) to carry out corresponding actions.
 
 ## Combine two Flows into a single Flow
 
@@ -33,9 +33,9 @@ Now you have two independent Flows indicating the conditions of face landmark de
 ```
 
 {{% notice Note %}}
-Kotlin Flow's [`combine`](https://kotlinlang.org/api/kotlinx.coroutines/kotlinx-coroutines-core/kotlinx.coroutines.flow/combine.html) transformation is equivalent to ReactiveX's [`combineLatest`](https://reactivex.io/documentation/operators/combinelatest.html). It combines emissions from multiple observables, so that each time **any** observable emits, the combinator function is called with the latest values from all sources.
+Kotlin Flow's [`combine`](https://kotlinlang.org/api/kotlinx.coroutines/kotlinx-coroutines-core/kotlinx.coroutines.flow/combine.html) transformation is equivalent to ReactiveX's [`combineLatest`](https://reactivex.io/documentation/operators/combinelatest.html). It combines emissions from multiple observables, so that each time any observable emits, the combinator function is called with the latest values from all sources.
 
-You might need to add `@OptIn(FlowPreview::class)` annotation since `sample` is still in preview. For more information on similar transformations, please refer to [this blog](https://kt.academy/article/cc-flow-combine).
+You might need to add the `@OptIn(FlowPreview::class)` annotation since `sample` is still in preview.
 
 {{% /notice %}}
 
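The `combine` semantics described in the note can be sketched as follows (assuming kotlinx.coroutines; `bothOkNow` and the flow names are illustrative helpers, not the app's code). Because a `StateFlow` supplies its current value immediately, `first()` deterministically returns the combination of the latest values:

```kotlin
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.combine
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.runBlocking

// Illustrative sketch: `combine` waits for one value from every source,
// then re-runs the combinator with the latest value from each source
// whenever any single source emits.
fun bothOkNow(face: Boolean, gesture: Boolean): Boolean = runBlocking {
    val faceOk = MutableStateFlow(face)
    val gestureOk = MutableStateFlow(gesture)
    combine(faceOk, gestureOk) { f, g -> f && g }.first()
}

fun main() {
    println(bothOkNow(face = true, gesture = false)) // only one condition holds
    println(bothOkNow(face = true, gesture = true))  // both conditions hold
}
```

The combined flow only reports `true` when both sources are `true` at the same moment, which is the whole point of the multimodality gate.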
@@ -49,30 +49,7 @@ You might need to add `@OptIn(FlowPreview::class)` annotation since `sample` is
 .shareIn(viewModelScope, SharingStarted.WhileSubscribed())
 ```
 
-If this code looks confusing to you, please see the explanations below for Kotlin beginners.
-
-{{% notice Info %}}
-
-###### Keyword "it"
-
-The operation `filter { it }` is simplified from `filter { bothOk -> bothOk == true }`.
-
-Since Kotlin allows for implictly calling the single parameter in a lambda `it`, `{ bothOk -> bothOk == true }` is equivalent to `{ it == true }`, and again `{ it }`.
-
-See [this doc](https://kotlinlang.org/docs/lambdas.html#it-implicit-name-of-a-single-parameter) for more details.
-
-{{% /notice %}}
-
-{{% notice Info %}}
-
-###### "Unit" type
-This `SharedFlow` has a generic type `Unit`, which doesn't contain any value. You may think of it as a "pulse" signal.
-
-The operation `map { }` simply maps the upstream `Boolean` value emitted from `_bothOk` to `Unit` regardless their values are true or false. It's simplified from `map { bothOk -> Unit }`, which becomes `map { Unit } ` where the keyword `it` is not used at all. Since an empty block already returns `Unit` implicitly, we don't need to explicitly return it.
-
-{{% /notice %}}
-
-If this still looks confusing, you may also opt to use `SharedFlow<Boolean>` and remove the `map { }` operation. Just note that when you collect this Flow, it doesn't matter whether the emitted `Boolean` values are true or false. In fact, they are always `true` due to the `filter` operation.
+You may also opt to use `SharedFlow<Boolean>` and remove the `map { }` operation. Just note that when you collect this Flow, it doesn't matter whether the emitted `Boolean` values are true or false. In fact, they are always `true` due to the `filter` operation.
 
 ## Configure ImageCapture use case
 
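The `filter { it }` followed by `map { }` chain that this hunk trims down to a single sentence can be sketched in isolation (assuming kotlinx.coroutines; `pulses` is an illustrative helper name, not from the app):

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.count
import kotlinx.coroutines.flow.filter
import kotlinx.coroutines.flow.flowOf
import kotlinx.coroutines.flow.map
import kotlinx.coroutines.runBlocking

// Illustrative helper (not from the app): pass through only `true`
// values, then discard them, leaving a Flow<Unit> "pulse" signal.
fun pulses(source: Flow<Boolean>): Flow<Unit> =
    source
        .filter { it } // shorthand for filter { ok -> ok == true }
        .map { }       // an empty lambda returns Unit, dropping the Boolean

fun main() = runBlocking {
    // Three `true` values in the source -> three Unit pulses downstream.
    println(pulses(flowOf(false, true, true, false, true)).count())
}
```

Mapping to `Unit` makes the contract explicit: collectors are told *when* to act, never *what* the Boolean was, since after `filter` it could only ever be `true`.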
@@ -92,7 +69,7 @@ If this still looks confusing, you may also opt to use `SharedFlow<Boolean>` and
 .build()
 ```
 
-3. Again, don't forget to append this use case to `bindToLifecycle`.
+3. Append this use case to `bindToLifecycle`.
 
 ```kotlin
 camera = cameraProvider.bindToLifecycle(
@@ -102,7 +79,7 @@ If this still looks confusing, you may also opt to use `SharedFlow<Boolean>` and
 
 ## Execute photo capture with ImageCapture
 
-1. Append the following constant values to `MainActivity`'s companion object. They define the file name format and the [MIME type](https://en.wikipedia.org/wiki/Media_type).
+1. Append the following constant values to `MainActivity`'s companion object. They define the file name format and the media type:
 
 ```kotlin
 // Image capture
@@ -165,7 +142,7 @@ If this still looks confusing, you may also opt to use `SharedFlow<Boolean>` and
 
 ## Add a flash effect upon capturing photo
 
-1. Navigate to `activity_main.xml` layout file and insert the following `View` element **between** the two overlay views and two `SwitchCompat` views. This is essentially just a white blank view covering the whole surface.
+1. Navigate to the `activity_main.xml` layout file and insert the following `View` element between the two overlay views and the two `SwitchCompat` views. This is essentially just a blank white view covering the whole surface.
 
 ```
 <View
@@ -204,6 +181,6 @@ If this still looks confusing, you may also opt to use `SharedFlow<Boolean>` and
 }
 ```
 
-3. Invoke `showFlashEffect()` method in `executeCapturePhoto()` method, **before** invoking `imageCapture.takePicture()`
+3. Invoke the `showFlashEffect()` method in the `executeCapturePhoto()` method, before invoking `imageCapture.takePicture()`.
 
 4. Build and run the app. Try keeping up a smiling face while presenting thumb-up gestures. When you see both switches turn on and stay stable for approximately half a second, the screen should flash white and then a photo should be captured and shows up in your album, which may take a few seconds depending on your Android device's hardware. Good job!

content/learning-paths/smartphones-and-mobile/build-android-selfie-app-using-mediapipe-multimodality/9-avoid-redundant-requests.md

Lines changed: 6 additions & 6 deletions
@@ -1,16 +1,16 @@
 ---
-title: Avoid duplicated photo capture requests
+title: Avoid duplicate photo capture requests
 weight: 9
 
 ### FIXED, DO NOT MODIFY
 layout: learningpathall
 ---
 
-So far, we have implemented the core logic for mediating MediaPipe's face and gesture task results and executing photo captures. However, the view controller does not communicate its execution results back to the view model. This introduces risks such as photo capture failures, frequent or duplicate requests, and other potential issues.
+So far, you have implemented the core logic for mediating MediaPipe's face and gesture task results and executing photo captures. However, the view controller does not communicate its execution results back to the view model. This introduces risks such as photo capture failures, frequent or duplicate requests, and other potential issues.
 
 ## Introduce camera readiness state
 
-It is a best practice to complete the data flow cycle by providing callbacks for the view controller's states. This ensures that the view model does not emit values in undesired states, such as when the camera is busy or unavailable.
+It is best practice to complete the data flow cycle by providing callbacks for the view controller's states. This ensures that the view model does not emit values in undesired states, such as when the camera is busy or unavailable.
 
 1. Navigate to `MainViewModel` and add a `MutableStateFlow` named `_isCameraReady` as a private member variable. This keeps track of whether the camera is busy or unavailable.
 
@@ -58,7 +58,7 @@ The duration of image capture can vary across Android devices due to hardware di
 
 To address this, implementing a simple cooldown mechanism after each photo capture can enhance the user experience while conserving computing resources.
 
-1. Add the following constant value to `MainViewModel`'s companion object. This defines a `3` sec cooldown before marking the camera available again.
+1. Add the following constant value to `MainViewModel`'s companion object. This defines a 3-second cooldown before marking the camera available again.
 
 ```kotlin
 private const val IMAGE_CAPTURE_DEFAULT_COUNTDOWN = 3000L
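The cooldown idea behind this constant can be sketched as follows. This is a hypothetical standalone class, not the app's actual `MainViewModel` code (the class, method names, and the configurable `cooldownMs` parameter are all illustrative assumptions):

```kotlin
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking

// Hypothetical sketch of the cooldown mechanism: mark the camera busy
// on capture, then flip it back to ready after the countdown elapses.
class CameraCooldown(
    private val scope: CoroutineScope,
    private val cooldownMs: Long = 3000L, // mirrors IMAGE_CAPTURE_DEFAULT_COUNTDOWN
) {
    private val _isCameraReady = MutableStateFlow(true)
    val isCameraReady: StateFlow<Boolean> = _isCameraReady

    fun onPhotoCaptured() {
        _isCameraReady.value = false       // camera is busy right after a capture
        scope.launch {
            delay(cooldownMs)              // wait out the cooldown
            _isCameraReady.value = true    // camera is available again
        }
    }
}

fun main() = runBlocking {
    val cooldown = CameraCooldown(this, cooldownMs = 50L) // short delay for the demo
    cooldown.onPhotoCaptured()
    println(cooldown.isCameraReady.value) // still cooling down
    delay(100L)
    println(cooldown.isCameraReady.value) // ready again
}
```

Exposing readiness as a `StateFlow<Boolean>` lets the view model simply gate its capture requests on the latest value, which is the data-flow cycle this chapter completes.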
@@ -91,6 +91,6 @@ However, silently failing without notifying the user is not a good practice for
 
 ## Completed sample code on GitHub
 
-If you run into any difficulties completing this learning path, feel free to check out the [completed sample code](https://github.com/hanyin-arm/sample-android-selfie-app-using-mediapipe-multimodality) and import it into Android Studio.
+If you run into any difficulties completing this learning path, you can check out the [completed sample code](https://github.com/hanyin-arm/sample-android-selfie-app-using-mediapipe-multimodality) and import it into Android Studio.
 
-If you discover a bug, encounter an issue, or have suggestions for improvement, we’d love to hear from you! Please feel free to [open an issue](https://github.com/hanyin-arm/sample-android-selfie-app-using-mediapipe-multimodality/issues/new) with detailed information.
+If you discover a bug, encounter an issue, or have suggestions for improvement, please [open an issue](https://github.com/hanyin-arm/sample-android-selfie-app-using-mediapipe-multimodality/issues/new) with detailed information.

0 commit comments
