When I run the liveness detection branch, the spoofing model fails while creating the TFLite interpreter and throws the error below. With TFLite dependency version 2.14.0 my Vivo device works, but on my Samsung device I get a casting issue: the model's tensors come through as int32 instead of float32. What is the correct way to convert/load the model? (A minimal diagnostic sketch follows the stack trace.)
Error:
Caused by: java.lang.IllegalArgumentException: Internal error: Cannot create interpreter: tensorflow/lite/kernels/conv.cc:350 input_channel % filter_input_channel != 0 (1 != 0)
Node number 1 (CONV_2D) failed to prepare.
at org.tensorflow.lite.NativeInterpreterWrapper.createInterpreter(Native Method)
at org.tensorflow.lite.NativeInterpreterWrapper.init(NativeInterpreterWrapper.java:106)
at org.tensorflow.lite.NativeInterpreterWrapper.<init>(NativeInterpreterWrapper.java:73)
at org.tensorflow.lite.NativeInterpreterWrapperExperimental.<init>(NativeInterpreterWrapperExperimental.java:36)
at org.tensorflow.lite.Interpreter.<init>(Interpreter.java:227)
at ltd.v2.gp_me.FaceSpoofDetector.<init>(FaceSpoofDetector.kt:56)
at ltd.v2.gp_me.FrameAnalyser.<init>(FrameAnalyser.kt:51)
at ltd.v2.gp_me.FaceRecognizationActivity.onCreate(FaceRecognizationActivity.kt:116)
at android.app.Activity.performCreate(Activity.java:7147)
at android.app.Activity.performCreate(Activity.java:7138)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1219)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2885)
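For reference, here is a minimal sketch of how I load the model and log its tensor types/shapes to compare the two devices. This is not my exact FaceSpoofDetector code; the asset name "liveness_model.tflite" and the helper function name are placeholders.

import android.content.Context
import android.util.Log
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Hypothetical helper: loads the model from assets and logs input/output tensor signatures.
fun inspectSpoofModel(context: Context) {
    val fd = context.assets.openFd("liveness_model.tflite") // placeholder asset name
    val modelBuffer: MappedByteBuffer = FileInputStream(fd.fileDescriptor).channel
        .map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)

    val options = Interpreter.Options().apply { setNumThreads(4) }
    Interpreter(modelBuffer, options).use { interpreter ->
        val input = interpreter.getInputTensor(0)
        val output = interpreter.getOutputTensor(0)
        // I expect FLOAT32 and an NHWC shape like [1, H, W, 3].
        // A dtype of INT32/UINT8, or a channels-first shape like [1, 3, H, W],
        // would match the CONV_2D channel mismatch and the int32-vs-float32
        // behaviour I see on the Samsung device.
        Log.d("ModelInspect", "input  dtype=${input.dataType()} shape=${input.shape().contentToString()}")
        Log.d("ModelInspect", "output dtype=${output.dataType()} shape=${output.shape().contentToString()}")
    }
}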