- If you use a Keras convolution layer as input, `data_format` must be `channels_last`. At the time of this writing (2019-07-24), TensorFlow Lite throws an error when you try to convert a CNN model whose `data_format` is `channels_first` to a `tflite` model.
- Convert your model to `tflite`.
- Example with a Keras model:

  ```python
  import tensorflow as tf

  model = tf.keras.models.load_model('model.h5')
  converter = tf.lite.TFLiteConverter.from_keras_model(model)
  tflite_model = converter.convert()
  with open('model.tflite', 'wb') as f:
      f.write(tflite_model)
  ```

- Create an Android project.
- Add `tensorflow-lite` to the `app/build.gradle` dependencies.
- Important:
  - Add the dependencies to the `build.gradle` in the `app` folder, not the `build.gradle` in the root project.
  - You will also need to install the Android NDK.
- Example `app/build.gradle`:

  ```gradle
  ...
  dependencies {
      ...
      // TensorFlow Lite library
      implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly'
  }
  ...
  ```
- Put your `tflite` model in the `assets` folder. It should be located at `app/src/main/assets`.
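- Example of reading the model from `assets` - a minimal sketch of the usual memory-mapping pattern, assuming the file is named `model.tflite` and that this code runs somewhere with access to an Android `Context` (both are assumptions, not from the steps above):

  ```java
  import android.content.res.AssetFileDescriptor;
  import java.io.FileInputStream;
  import java.io.IOException;
  import java.nio.MappedByteBuffer;
  import java.nio.channels.FileChannel;

  // Memory-map the model from the assets folder so the Interpreter
  // can read it directly; this is why the file must not be compressed.
  public MappedByteBuffer loadModelFile(android.content.Context context) throws IOException {
      AssetFileDescriptor fd = context.getAssets().openFd("model.tflite"); // assumed file name
      try (FileInputStream inputStream = new FileInputStream(fd.getFileDescriptor());
           FileChannel channel = inputStream.getChannel()) {
          return channel.map(FileChannel.MapMode.READ_ONLY,
                  fd.getStartOffset(), fd.getDeclaredLength());
      }
  }
  ```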
- Tell Gradle not to compress the `tflite` model.
  - Add `noCompress 'tflite'` to `app/build.gradle`.
- Example `app/build.gradle`:

  ```gradle
  ...
  android {
      ...
      aaptOptions {
          noCompress 'tflite'
      }
  }
  ...
  ```
- Create an `Interpreter` in Java or Kotlin.

After you add `tensorflow-lite` to the project's dependencies and sync Gradle, you should be able to import `org.tensorflow.lite.Interpreter`.
The Keras model's equivalent in TFLite is an `Interpreter`.
To create an `Interpreter` you will need the TFLite model (in the form of a `MappedByteBuffer`) and an `Interpreter.Options` object.
To run inference (the equivalent of `model.predict` in Keras) you will need to pre-allocate a `ByteBuffer` for the input and a `float[]` (the exact type depends on your model) for the output. The input `ByteBuffer` must be filled with your data before calling `Interpreter.run(input, output)`.
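The input preparation can be sketched in plain Java. The input length of 4 floats and output shape of `1×3` below are illustrative assumptions, not from any particular model; TFLite expects a direct `ByteBuffer` in native byte order:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class InputPrep {
    // Pack a float array into a direct, native-order ByteBuffer,
    // the form Interpreter.run expects for its input.
    static ByteBuffer toInputBuffer(float[] values) {
        ByteBuffer input = ByteBuffer.allocateDirect(4 * values.length)
                .order(ByteOrder.nativeOrder());
        for (float v : values) {
            input.putFloat(v);
        }
        input.rewind(); // reset position so the Interpreter reads from the start
        return input;
    }

    public static void main(String[] args) {
        float[] features = {0.1f, 0.2f, 0.3f, 0.4f}; // hypothetical input values
        ByteBuffer input = toInputBuffer(features);
        float[][] output = new float[1][3];          // shape depends on your model
        // With a real model you would then run:
        // Interpreter tflite = new Interpreter(modelBuffer, new Interpreter.Options());
        // tflite.run(input, output);
        System.out.println(input.capacity()); // 16 bytes = 4 floats
    }
}
```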
- The official TensorFlow example - too complicated for a simple Android app that performs an image-classification task. However, it did help with my implementation; you just need to find the right file to read.
- The TensorFlow Lite documentation.
