`ext/ai/README.md`
`Supabase.ai` uses [onnxruntime](https://onnxruntime.ai/) as its internal model
execution engine, backed by the [ort pyke](https://ort.pyke.io/) Rust bindings.
<details>
<summary>Javascript docs</summary>
The **onnxruntime** API is available from `globalThis` and shares a similar spec with [onnxruntime-common](https://github.com/microsoft/onnxruntime/tree/main/js/common).

The available items are:
- `Tensor`: Represents a basic tensor with specified dimensions and data type -- "The AI input/output"
- `InferenceSession`: Represents the inner model session -- "The AI model itself"
### Usage
### Third party libs

Originally, this backend was created to integrate implicitly with [transformers.js](https://github.com/huggingface/transformers.js/). This way users can keep consuming a high-level library while benefiting from all of Supabase's Model Execution Engine features, like model optimization and caching. For further information please check [PR #436](https://github.com/supabase/edge-runtime/pull/436).

> [!WARNING]
> At this moment users need to explicitly target `device: 'auto'` to enable the platform compatibility.
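As a hedged sketch of what the warning above means in practice (not runnable outside the edge runtime; the task and model name are illustrative, and only the `device: 'auto'` option comes from this README):

```javascript
// Hedged sketch: a transformers.js pipeline routed through the platform engine.
// 'feature-extraction' and 'Supabase/gte-small' are illustrative choices;
// `device: 'auto'` is the option this README says must be set explicitly.
import { pipeline } from '@huggingface/transformers';

const extractor = await pipeline('feature-extraction', 'Supabase/gte-small', {
  device: 'auto', // required to enable platform compatibility
});
const output = await extractor('Hello world', { pooling: 'mean', normalize: true });
```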
</details>
## The `Session` class

Prior versions [introduced](https://supabase.com/blog/ai-inference-now-available-in-supabase-edge-functions) the `Session` class as an alternative to `transformers.js` for the *gte-small* model; it was later used to provide an [LLM interface](https://supabase.com/docs/guides/functions/ai-models?queryGroups=platform&platform=ollama#using-large-language-models-llm) for Ollama and some other providers.

Since the **Model Execution Engine** was created, the `Session` class can now focus on the LLM interface, while `Session('gte-small')` is kept for compatibility purposes only.
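Based on the linked Supabase docs, `Session` usage looks roughly like this. A hedged sketch that only runs inside the edge runtime; the input text, options, and the `'mistral'` model name are illustrative:

```javascript
// Hedged sketch (edge runtime only): the legacy embedding path,
// kept for compatibility per the paragraph above.
const model = new Supabase.ai.Session('gte-small');
const embedding = await model.run('Hello world', {
  mean_pool: true,
  normalize: true,
});

// The LLM interface via an external provider such as Ollama
// (model name is illustrative).
const llm = new Supabase.ai.Session('mistral');
const response = await llm.run('Why is the sky blue?', { stream: false });
```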
> [!WARNING]
> Docs for the `Session` class end here - there's an open [PR #539](https://github.com/supabase/edge-runtime/pull/539) that may change a lot of things for it.