content/learning-paths/mobile-graphics-and-gaming/Vision-LLM-inference-on-Android-with-KleidiAI-and-MNN/1-devenv-and-model.md
+26 -23 lines changed: 26 additions & 23 deletions
@@ -5,53 +5,57 @@ weight: 3
### FIXED, DO NOT MODIFY
layout: learningpathall
---
+ ## Install Required Software

- In this section, you will set up a development environment by installing dependencies and preparing the Qwen vision model.
+ In this section, you'll set up your development environment by installing dependencies and preparing the Qwen vision model.

- ## Install required software
+ Install the Android NDK (Native Development Kit) and git-lfs. This Learning Path was tested with NDK version `28.0.12916984` and CMake version `4.0.0-rc1`.

- Install the Android NDK (Native Development Kit) and git-lfs. This learning path was tested with NDK version `28.0.12916984` and CMake version `4.0.0-rc1`.
-
- For Ubuntu or Debian systems, you can install CMake and git-lfs with the following command:
+ For Ubuntu or Debian systems, install CMake and git-lfs with the following commands:

```bash
sudo apt update
sudo apt install cmake git-lfs -y
```

- You can use Android Studio to obtain the NDK. Click **Tools > SDK Manager**, and navigate to the the SDK Tools tab. Select the NDK (Side by side) and CMake checkboxes, as shown below:
+ Alternatively, you can use Android Studio to obtain the NDK.
+
+ Click **Tools > SDK Manager** and navigate to the **SDK Tools** tab.
+
+ Select the **NDK (Side by side)** and **CMake** checkboxes, as shown below:

- Refer to [Install NDK and CMake](https://developer.android.com/studio/projects/install-ndk) for other installation methods.
+ See [Install NDK and CMake](https://developer.android.com/studio/projects/install-ndk) for other installation methods.
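As an aside, one such alternative is Android's command-line `sdkmanager`. The sketch below is illustrative only; the package version strings are assumptions, so check `sdkmanager --list` for what is actually available:

```bash
# Sketch: install an NDK and CMake package from the command line.
# Version strings are assumptions - list the available packages first.
sdkmanager --list | grep -E "ndk;|cmake;"
sdkmanager --install "ndk;28.0.12916984" "cmake;3.22.1"
```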

- Make sure Python and pip is installed by verifying a version is printed on running this command:
+ Ensure that Python and pip are installed by verifying the version with these commands:

```bash
python --version
pip --version
```

{{% notice Note %}}
- The above commands may fail when Python is installed if Python 3.x is not the default version. You can try running `python3 --version` and `pip3 --version` to be sure.
+ If Python 3.x is not the default version, try running `python3 --version` and `pip3 --version`.
{{% /notice %}}

- ## Set up phone connection
+ ## Set up Phone Connection

- You will need to set up an authorized connection with your phone. The Android SDK Platform Tools package, included in Android Studio, comes with Android Debug Bridge (ADB). You will use this tool to transfer files later on.
+ You need to set up an authorized connection with your phone. The Android SDK Platform Tools package, included with Android Studio, provides Android Debug Bridge (ADB) for transferring files.

- Connect your phone to the computer using a USB cable. You will need to activate USB debugging on your phone. Find the **Build Number** in your **Settings** app and tap it 7 times. Then, enable **USB debugging** in **Developer Options**.
+ Connect your phone to your computer using a USB cable, and enable USB debugging on your phone. To do this, tap the **Build Number** in your **Settings** app 7 times, then enable **USB debugging** in **Developer Options**.

- You should now see your device listed upon running `adb devices`:
+ Verify the connection by running `adb devices`:

```output
List of devices attached
<DEVICE ID> device
```
+ You should see your device listed.

- ## Download and convert the model
+ ## Download and Convert the Model

- The following commands download the model from Hugging Face, and clones a tool for exporting LLM model to the MNN framework.
+ The following commands download the model from Hugging Face, and clone a tool for exporting the LLM model to the MNN framework.
|`--sym`|Symmetric quantization (without zeropoint); default is False. | The quantization parameter that enables symmetrical quantization. |
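The download and export commands themselves are not reproduced in this diff. As a rough, illustrative sketch of what they might look like (the repository URLs, script name, and flag values below are assumptions, not the Learning Path's exact commands):

```bash
# Sketch only: fetch the model and the MNN export tooling (URLs and paths are assumptions).
git lfs install
git clone https://huggingface.co/Qwen/Qwen2-VL-2B-Instruct
git clone https://github.com/alibaba/MNN.git

# Export to MNN format with 4-bit quantization; flag names are assumptions
# based on the parameter table above (for example, --sym for symmetric quantization).
cd MNN/transformers/llm/export
python llmexport.py \
    --path ../../../../Qwen2-VL-2B-Instruct \
    --export mnn \
    --quant_bit 4 \
    --sym \
    --dst_path ../../../../Qwen2-VL-2B-Instruct-convert-4bit-per_channel
```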

- To learn more about the parameters, refer to the [transformers README.md](https://github.com/alibaba/MNN/tree/master/transformers).
+ To learn more about the parameters, see the [transformers README.md](https://github.com/alibaba/MNN/tree/master/transformers).

- Verify the model is built correct by checking the size of the resulting model. The `Qwen2-VL-2B-Instruct-convert-4bit-per_channel` directory should be at least 1 GB in size.
+ Verify that the model was built correctly by checking that the `Qwen2-VL-2B-Instruct-convert-4bit-per_channel` directory is at least 1 GB in size.
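One simple way to check this (a minimal sketch; any equivalent method works):

```bash
# Print the total size of the converted model directory; expect at least 1 GB.
du -sh Qwen2-VL-2B-Instruct-convert-4bit-per_channel
```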
content/learning-paths/mobile-graphics-and-gaming/Vision-LLM-inference-on-Android-with-KleidiAI-and-MNN/2-generate-apk.md
+13 -7 lines changed: 13 additions & 7 deletions
@@ -6,11 +6,13 @@ weight: 4
layout: learningpathall
---

- In this section, you will try the Qwen model in action using a demo application using a Android Package Kit (APK)
-
## Clone MNN repo

- A fork of the upstream MNN repository is set up to enable building the app as an Android Studio project. Run the following to clone the repository and checkout the source tree:
+ In this section, you will run the Qwen model using a demo application packaged as an Android Package Kit (APK).
+
+ A fork of the upstream MNN repository is set up to enable building the app as an Android Studio project.
+
+ Run the following commands to clone the repository and checkout the source tree:

```bash
cd $HOME
@@ -19,19 +21,23 @@ cd MNN
git checkout origin/llm_android_demo
```

- ## Build the app using Android Studio
+ ## Build the App Using Android Studio

### Open project and build

- Open Android Studio. Go to **File > Open**. Navigate to the MNN repository you just cloned. Expand the `transformers/llm/engine/` directories, select the `android` one and click `Open`.
+ Open Android Studio.
+
+ Go to **File > Open**.
+
+ Navigate to the cloned MNN repository, expand the `transformers/llm/engine/` directories, select the `android` directory, and click `Open`.

- This will trigger a build of the project, and you should see a similar output on completion:
+ This triggers a build of the project, and you should see output similar to the following on completion:

```output
BUILD SUCCESSFUL in 1m 42s
```

- ### Generate and run the APK
+ ### Generate and Run the APK

Navigate to **Build > Generate App Bundles or APKs**. Select **Generate APKs**.
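If you prefer the command line to the Android Studio menus, a debug APK can usually be produced with the Gradle wrapper. This is a sketch under the assumption that the project uses a standard Gradle layout; the directory and task names may differ:

```bash
# Sketch: build a debug APK from the command line instead of the Generate APKs menu.
cd $HOME/MNN/transformers/llm/engine/android
./gradlew assembleDebug
# The APK is typically written under <module>/build/outputs/apk/debug/.
```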
content/learning-paths/mobile-graphics-and-gaming/Vision-LLM-inference-on-Android-with-KleidiAI-and-MNN/3-benchmark.md
+32 -18 lines changed: 32 additions & 18 deletions
@@ -5,12 +5,13 @@ weight: 5
### FIXED, DO NOT MODIFY
layout: learningpathall
---
+ ## Prepare an Example Image

- In this section, you will use the model to benchmark performance with and without KleidiAI kernels. You will need to compile library files to run the optimized inference.
+ In this section, you'll benchmark model performance with and without KleidiAI kernels. To run optimized inference, you'll first need to compile the required library files. You'll also need an example image to run command-line prompts.

- ## Prepare an example image
+ You can use the image of the tiger below, which this Learning Path uses as its example, or choose your own.

- You will use an image to run a command-line prompt. In this learning path, the tiger below will be used as an example. You can save this image or provide one of your own. Re-name the image to `example.png` in order to use the commands in the following sections.
+ Whichever image you select, rename it to `example.png` to use the commands in the following sections.

@@ -20,9 +21,13 @@ Use ADB to load the image onto your phone:
adb push example.png /data/local/tmp/
```

- ## Build binaries for command-line inference
+ ## Build Binaries for Command-line Inference

- Navigate to the MNN project you cloned in the previous section. Create a build directory and run the script. The first time, you will build the binaries with the `-DMNN_KLEIDIAI` flag set to `FALSE`.
+ Navigate to the MNN project that you cloned in the previous section.
+
+ Create a build directory and run the build script.
+
+ The first time that you do this, build the binaries with the `-DMNN_KLEIDIAI` flag set to `FALSE`.
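The exact build invocation is not shown in this diff. As an illustrative sketch, assuming the repository's `build_64.sh` script forwards extra CMake options (the directory layout is also an assumption):

```bash
# Sketch: baseline build with KleidiAI disabled.
cd $HOME/MNN/project/android
mkdir -p build_64 && cd build_64
../build_64.sh "-DMNN_KLEIDIAI=FALSE"
```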
- If your NDK toolchain isn't set up correctly, you may run into issues with the above script. Make note of where the NDK was installed - this will be a directory named after the version you downloaded earlier. Try exporting the following environment variables before re-running `build_64.sh`.
+ If your NDK toolchain isn't set up correctly, you might run into issues with the above script. Make a note of where the NDK was installed - this will be a directory named after the version you downloaded earlier. Try exporting the following environment variables before re-running `build_64.sh`:
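The variable list itself is elided from this diff. A typical set might look like the following, where the SDK location and NDK version are assumptions about your machine:

```bash
# Sketch: point the build at your NDK install (paths and version are assumptions).
export ANDROID_HOME=$HOME/Android/Sdk
export ANDROID_NDK=$ANDROID_HOME/ndk/28.0.12916984
export ANDROID_NDK_HOME=$ANDROID_NDK
export PATH=$ANDROID_NDK:$PATH
```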
- The following commands should be run in the ADB shell. Navigate to the directory you pushed the files to, add executable permissions to the `llm_demo` file and export an environment variable for it to run properly. After this, use the example image you transferred earlier to create a file containing the text content for the prompt.
+ Run the following commands in the ADB shell. Navigate to the directory you pushed the files to, add executable permissions to the `llm_demo` file and export an environment variable for it to run properly. After this, use the example image you transferred earlier to create a file containing the text content for the prompt.

```bash
cd /data/local/tmp/
@@ -60,13 +67,13 @@ export LD_LIBRARY_PATH=$PWD
echo "<img>./example.png</img>Describe the content of the image." > prompt
```

- Finally, run an inference on the model with the following command.
+ Finally, run an inference on the model with the following command:
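The command itself is elided from this diff. Judging by the config path in the output below, it presumably resembles the following; the exact arguments to `llm_demo` are an assumption:

```bash
# Sketch: run the demo binary against the converted model's config and the prompt file.
./llm_demo models/Qwen-VL-2B-convert-4bit-per_channel/config.json prompt
```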
- If the launch is successful, you should see the following output, with the performance benchmark at the end.
+ If the launch is successful, you should see the following output, with the performance benchmark at the end:

```output
config path is models/Qwen-VL-2B-convert-4bit-per_channel/config.json
@@ -86,34 +93,39 @@ prefill speed = 192.28 tok/s
##################################
```

- ## Enable KleidiAI and re-run inference
+ ## Enable KleidiAI and Re-run Inference

- The next step is to re-generate the binaries with KleidiAI activated. This is done by updating the flag `-DMNN_KLEIDIAI` to `TRUE`. From the `build_64` directory, run:
+ The next step is to re-generate the binaries with KleidiAI activated. This is done by updating the flag `-DMNN_KLEIDIAI` to `TRUE`.
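The rebuild command is not reproduced here. From the `build_64` directory it would presumably be the same script invocation with the flag flipped (again assuming `build_64.sh` forwards extra CMake options):

```bash
# Sketch: rebuild with the KleidiAI micro-kernels enabled.
cd $HOME/MNN/project/android/build_64
../build_64.sh "-DMNN_KLEIDIAI=TRUE"
```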
- The next step is to update the files on your phone. Start by removing the ones used in the previous step. Then, push the new ones with the same command as before.
+ First, remove existing binaries from your Android device, then push the updated files:
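The file list is elided from this diff; as a rough sketch (the specific file names and device paths are assumptions):

```bash
# Sketch: clear the old binaries on the device, then push the rebuilt ones.
adb shell "rm -f /data/local/tmp/llm_demo /data/local/tmp/*.so"
adb push llm_demo *.so /data/local/tmp/
```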
- This time, you should see an improvement in the benchmark. Below is an example table showing the uplift on three relevant metrics after enabling the KleidiAI kernels.
+ This time, you should see an improvement in the benchmark. Below is an example table showing the uplift on three relevant metrics after enabling the KleidiAI kernels:

- The prefill speed describes how fast the model processes the input prompt. The decode speed corresponds to the rate at which the model generates new tokens after the input is processed
+ **Prefill speed** describes how fast the model processes the input prompt.
+
+ **Decode speed** indicates how quickly the model generates new tokens after the input is processed.

- This shows the advantages of using Armoptimized kernels for your ViT use-cases.
+ These benchmarks clearly demonstrate the performance advantages of using Arm-optimized KleidiAI kernels for vision transformer (ViT) workloads.
content/learning-paths/mobile-graphics-and-gaming/Vision-LLM-inference-on-Android-with-KleidiAI-and-MNN/_index.md
+6 -6 lines changed: 6 additions & 6 deletions
@@ -3,18 +3,18 @@ title: Vision LLM inference on Android with KleidiAI and MNN

minutes_to_complete: 30

- who_is_this_for: This learning path is for developers who want to run Vision Transformers (ViT) efficiently on an Android device.
+ who_is_this_for: This Learning Path is for developers who want to run Vision Transformers (ViT) efficiently on Android.

learning_objectives:
- - Download the a Vision Large Language Model (LLM) from Hugging Face.
+ - Download a Vision Large Language Model (LLM) from Hugging Face.
- Convert the model to the Mobile Neural Network (MNN) framework.
- - Install an Android demo application with the model to run an inference.
- - Compare model inference performance with and without KleidiAI Armoptimized micro-kernels.
+ - Install an Android demo application using the model to run an inference.
+ - Compare inference performance with and without KleidiAI Arm-optimized micro-kernels.


prerequisites:
- A development machine with [Android Studio](https://developer.android.com/studio) installed.
- - A 64-bit Armpowered smartphone running Android with `i8mm` and `dotprod` supported.
+ - A 64-bit Arm-powered smartphone running Android with support for `i8mm` and `dotprod`.

author:
- Shuheng Deng
@@ -36,7 +36,7 @@ operatingsystems:

further_reading:
- resource:
- title: "MNN: A UNIVERSAL AND EFFICIENT INFERENCE ENGINE"
+ title: "MNN: A Universal and Efficient Inference Engine"