Description
This is similar to something mentioned in #16, but the codebase seems to have changed since then, so I thought I'd create a new issue.
I'd like to try using this library with one of my models, which has a dynamic input size (`[?, ?, ?, 1]` / `[-1, -1, -1, 1]` on the first layer, which is later resized by a `ResizeBilinear` op). However, when I called `model.run`, I got the following error, which I assume means the library is not treating this first layer's size as dynamic:
```
TFLite: Input Buffer size (16384) does not match the Input Tensor's expected size (4)! Make sure to resize the input values accordingly.
```
I am passing in the image as a grayscale `tfCore.Tensor`, `img` (64 x 64 x 1), which I convert to a `Float32Array` with `new Float32Array(await img.data())`. Is there currently any way to use models with a dynamic input size? I'm also willing to look into implementing this myself; any tips in that regard would be appreciated.
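For reference, here is a minimal sketch of what triggers the error. The model-loading step is omitted and `model` is only typed structurally, since I'm not sure of the exact loading API; `tfCore` is `@tensorflow/tfjs-core` imported under that alias, and `model.run` is the call mentioned above:

```ts
import * as tfCore from '@tensorflow/tfjs-core';

// `model.run(Float32Array)` stands in for however the loaded TFLite model
// is invoked; the structural type here is just for illustration.
async function reproduce(model: { run: (input: Float32Array) => unknown }) {
  // 64 x 64 single-channel (grayscale) image.
  const img: tfCore.Tensor = tfCore.zeros([64, 64, 1]);

  // Flatten the tensor into a Float32Array:
  // 64 * 64 * 1 floats * 4 bytes = 16384 bytes, the buffer size in the error.
  const input = new Float32Array(await img.data());

  // With the [-1, -1, -1, 1] input signature, the runner seems to expect a
  // 4-byte (single-element) tensor instead, hence the size mismatch.
  model.run(input);
}
```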