`Supabase.ai` uses [onnxruntime](https://onnxruntime.ai/) as its internal model execution engine, backed by the [ort pyke](https://ort.pyke.io/) Rust bindings.

Specific documentation for both "lands" follows:

<details>
<summary>JavaScript/Frontend</summary>

The **onnxruntime** API is available from `globalThis` and follows a spec similar to [onnxruntime-common](https://github.com/microsoft/onnxruntime/tree/main/js/common).

The available items are:
- `Tensor`: represents a basic tensor with specified dimensions and data type -- "the AI input/output".
- `InferenceSession`: represents the inner model session -- "the AI model itself".

### Usage
It can be used via the exported `globalThis[Symbol.for("onnxruntime")]` -- but manipulating it directly is not trivial, so in the future you may prefer the [Inference API #501](https://github.com/supabase/edge-runtime/pull/501) for a more user-friendly interface.

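For example, a minimal sketch of direct usage, assuming the exported object mirrors the `onnxruntime-common` spec described above (the model path and the `input` feed name are hypothetical and depend on your model):

```js
const { Tensor, InferenceSession } = globalThis[Symbol.for("onnxruntime")];

// "The AI model itself": load an ONNX model into an inference session.
const session = await InferenceSession.create("/models/model.onnx");

// "The AI input/output": a 1x3 float32 tensor.
const input = new Tensor("float32", new Float32Array([1, 2, 3]), [1, 3]);

// Run inference; the feed name must match the model's input name.
const results = await session.run({ input });
console.log(results);
```
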
Originally this backend was created to integrate implicitly with [transformers.js](https://github.com/huggingface/transformers.js/). This way users can keep consuming a high-level library while benefiting from all of Supabase's Model Execution Engine features, like model optimization and caching. For further information please check [PR #436](https://github.com/supabase/edge-runtime/pull/436).

> [!WARNING]
> At the moment, users need to explicitly target `device: 'auto'` to enable platform compatibility.
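
For example, a minimal sketch of the transformers.js integration, assuming the standard `@huggingface/transformers` pipeline API (the import specifier and model name are illustrative):

```js
import { pipeline } from "@huggingface/transformers";

// `device: 'auto'` must be set explicitly to enable platform compatibility,
// as noted in the warning above.
const extractor = await pipeline("feature-extraction", "Supabase/gte-small", {
  device: "auto",
});

// Model fetching, caching, and optimization are handled transparently
// by the Model Execution Engine described above.
const embedding = await extractor("Hello world", {
  pooling: "mean",
  normalize: true,
});
console.log(embedding);
```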