Replies: 3 comments 1 reply
-
I can't help you! Sorry.
-
This is a good ask. @kai4avaya is still working on the documentation, and we will be sure to capture these requests there and possibly add the ability to run things locally. I would think a simple local endpoint API would suffice.
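To make the "simple local endpoint API" idea concrete, here is a minimal sketch of what pointing the tutor at a locally hosted, OpenAI-compatible server (e.g. llama.cpp's server or Ollama serving a quantized Llama) could look like. The endpoint URL, model name, and system prompt below are assumptions for illustration, not SocratiQ's actual configuration:

```python
import json

# Assumed local OpenAI-compatible endpoint (Ollama's default port shown here).
LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"

def build_request(question: str, model: str = "llama3.1:8b-instruct-q4_K_M") -> dict:
    """Build an OpenAI-style chat-completion payload for a local server.

    The model tag is a hypothetical quantized Llama; any model the local
    server exposes would work the same way.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a Socratic tutor for this book."},
            {"role": "user", "content": question},
        ],
        "stream": False,
    }

# Serialize the payload; POSTing it to LOCAL_ENDPOINT (e.g. with requests.post)
# would return a completion from the locally running model.
payload = json.dumps(build_request("What is quantization?"))
```

The appeal of this design is that the frontend only needs one configurable base URL: swapping the hosted service for a local server (or a smaller open model) becomes a configuration change rather than a code change.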
-
@profvjreddi Are there plans to port SocratiQ to DeepSeek, and would that significantly reduce the cost of scaling? I am an entrepreneur and part-time teacher at an underserved high school in Chicago, interested in democratizing access to premium education, such as AP exam prep, which is currently not affordable for many of our students. LinkedIn handle: tomslee99
-
I would like to know how I can run SocratiQ AI locally. When rendering the book locally, it does not work. Also, is it possible to use a smaller open model (a quantized Llama, for example)? Thanks