Replies: 2 comments 7 replies
- I haven't tried it myself. As far as I know, WL does not support local LLMs (@KirillBelovTest might correct me on that). But handcrafting an adapter could be something... https://community.wolfram.com/groups/-/m/t/3235028
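  A minimal sketch of what such a handcrafted adapter might look like, assuming a local server that exposes an OpenAI-compatible chat endpoint (e.g. Ollama on port 11434); the URL, port, and model name are assumptions about a typical local setup, not anything built into WLJS or WL's LLM framework:

  ```mathematica
  (* Sketch: POST a chat request to a local OpenAI-compatible endpoint.
     The endpoint URL and model name are placeholders for a local setup. *)
  req = HTTPRequest[
     "http://localhost:11434/v1/chat/completions",
     <|
       "Method" -> "POST",
       "Headers" -> {"Content-Type" -> "application/json"},
       "Body" -> ExportString[
         <|
           "model" -> "llama3",
           "messages" -> {<|"role" -> "user", "content" -> "Hello from WL"|>}
         |>, "JSON"]
     |>];

  response = URLExecute[req, "RawJSON"];

  (* OpenAI-style response shape: take the first choice's message text *)
  response["choices"][[1]]["message"]["content"]
  ```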
- Or do you mean to use a local LLM as a notebook assistant?
- I see that it is possible to access LLMs through their APIs. How about accessing local LLMs in WLJS?