[Feature] Support Hailo for machine learning hardware acceleration #17230
-
I would appreciate support for the Hailo AI HAT/Kit. It would give a lot of small installations machine learning capabilities at a moderate investment. Implementing this seems like an obvious win to me.
-
According to the product sheet, it should be compatible with TensorFlow, TensorFlow Lite, Keras, PyTorch and ONNX.
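For context, the usual route onto Hailo hardware is to export a model to ONNX first and then feed it to Hailo's Dataflow Compiler, which produces a device binary (.hef). Below is a minimal sketch of just the ONNX export step for a PyTorch model; the model, input shape and opset are placeholders, not anything Immich actually ships, and the compile step itself is not shown.

```python
# Sketch: export a PyTorch model to ONNX as the first step towards Hailo's
# toolchain (the resulting .onnx is then compiled by Hailo's Dataflow
# Compiler into a .hef). Model and input shape are placeholders.
import torch
import torchvision.models as models

model = models.mobilenet_v2(weights=None)  # placeholder model
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # NCHW input used for tracing
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=13,  # older opsets tend to be safer for embedded compilers
)
```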
-
There were some compatibility issues compiling our model catalog to Hailo format when I last played around with this, but that was almost a year ago. They may have better out-of-the-box support for these models by now.
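If anyone wants to re-check how the current models fare, a cheap first pass is to validate the exported ONNX graph and list its opset and operators before handing it to Hailo's compiler, since unsupported ops and opset versions are the usual source of trouble. A sketch, assuming a local model.onnx and the onnx Python package:

```python
# Sketch: sanity-check an exported ONNX model before attempting a Hailo
# compile. "model.onnx" is a placeholder path.
import onnx

model = onnx.load("model.onnx")
onnx.checker.check_model(model)  # raises if the graph is structurally invalid

# Opset versions and the operator inventory are the usual compatibility culprits.
print("opsets:", [(imp.domain or "ai.onnx", imp.version) for imp in model.opset_import])
print("ops used:", sorted({node.op_type for node in model.graph.node}))
```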
-
Hailo-8 support would be nice, using a Hailo-8 PCIe card.
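Before any application-level integration, it helps to confirm the host can even see the card. A small sketch, assuming HailoRT is installed and that its hailortcli tool offers a scan subcommand for listing attached devices (check the HailoRT docs if the command differs on your version); the /dev/hailo* fallback is also an assumption about the driver's device-node naming:

```python
# Sketch: check whether a Hailo device is visible on the host.
# Assumes HailoRT's `hailortcli` CLI with a `scan` subcommand; falls back
# to looking for /dev/hailo* device nodes (assumed naming).
import glob
import shutil
import subprocess

def hailo_device_present() -> bool:
    if shutil.which("hailortcli"):
        result = subprocess.run(
            ["hailortcli", "scan"], capture_output=True, text=True
        )
        if result.returncode == 0 and result.stdout.strip():
            return True
    return bool(glob.glob("/dev/hailo*"))

if __name__ == "__main__":
    print("Hailo device detected:", hailo_device_present())
```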
-
I have searched the existing feature requests, both open and closed, to make sure this is not a duplicate request.
The feature
I run Immich on a Raspberry Pi 5 and would like to connect a Raspberry Pi AI HAT to improve machine learning performance. From what I understand, this would require supporting the Hailo modules for hardware acceleration. I suspect the Raspberry Pi 5 is a common entry point for people starting their self-hosting journey (like myself!), so supporting the RPi AI hardware would be valuable.
The relevant hardware is currently the Raspberry Pi AI Kit and AI HAT, which carry Hailo-8L/Hailo-8 modules.
There was a related discussion here, but it is closed and the Hailo part was only a side-discussion, so I'm opening a new request to see whether there has been any progress since then.
Thanks in anticipation 💚
Platform