## Mixed with MNN or ONNXRuntime 👇👇

The goal of lite.ai.toolkit is not to abstract on top of MNN and ONNXRuntime. So, you can use lite.ai.toolkit mixed with MNN (`-DENABLE_MNN=ON`, default `OFF`) or ONNXRuntime (`-DENABLE_ONNXRUNTIME=ON`, default `ON`). The lite.ai.toolkit installation package contains complete MNN and ONNXRuntime. The workflow may look like:
```C++
#include "lite/lite.h"

// 0. use YoloV5 from lite.ai.toolkit to detect objs.
auto *yolov5 = new lite::cv::detection::YoloV5(onnx_path);
// 1. use ONNXRuntime or MNN to implement your own classifier.
// 2. then, classify the detected objs using your own classifier ...
```
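In practice, mixing just means enabling the backends you need when building lite.ai.toolkit, then linking the installed package from your own project. Below is a minimal CMake sketch. Only the `-DENABLE_MNN`/`-DENABLE_ONNXRUNTIME` flags come from the text above; the install path, target name, and library names are illustrative assumptions, not taken from the toolkit's docs:

```cmake
# First, build lite.ai.toolkit itself with both backends enabled:
#   cmake -DENABLE_ONNXRUNTIME=ON -DENABLE_MNN=ON ..
#
# Then, in your own project (all names below are hypothetical):
cmake_minimum_required(VERSION 3.10)
project(mixed_demo)
set(CMAKE_CXX_STANDARD 17)

# Path to the lite.ai.toolkit installation package (assumption).
set(LITE_AI_DIR "/path/to/lite.ai.toolkit")
include_directories(${LITE_AI_DIR}/include)
link_directories(${LITE_AI_DIR}/lib)

add_executable(mixed_demo main.cpp)
# Link the toolkit plus the MNN/ONNXRuntime libraries it bundles
# (library names are assumptions; check your install's lib/ directory).
target_link_libraries(mixed_demo lite.ai.toolkit onnxruntime MNN)
```

Because the installation package already contains complete MNN and ONNXRuntime, you should not need a separate install of either backend for this setup.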
The included headers of MNN and ONNXRuntime can be found at [mnn_config.h](./lite/mnn/core/mnn_config.h) and [ort_config.h](./lite/ort/core/ort_config.h).
<details>
<summary> 🔑️ Check the output log! Click here! </summary>