Hi,
I can't get the Ollama integration to work with the basic settings (http://localhost:11434 on a Mac with the default Ollama app).
The CLI works perfectly fine.
I have tried the two models described in the docs (although I just read in the Issues that embeddings are not going to work).
I get a 405 error when using the chat from Trilium.
From what I can find online, this happens with certain models, and it was a bug that was supposedly fixed a long time ago (around Ollama 0.3.5 or so).
I also read that the models need to have the "tools" capability, so I tried some of those as well.
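(If it matters, my plan for narrowing this down was to hit the endpoints directly with curl, roughly like below; the model name is just an example and I'm not sure these are exactly the calls Trilium makes:)

```sh
# List the locally pulled models
curl http://localhost:11434/api/tags

# Single (non-streamed) chat request against a tools-capable model
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [{"role": "user", "content": "hello"}],
  "stream": false
}'
```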
I checked 'Expose Ollama to the network', without luck.
If I try to fetch() from a JS Frontend script, it won't work because of CORS issues. (Note: I just converted the curl command from the main page of Ollama's GitHub, so I'm not sure it's the right thing to do.)
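This is roughly what my conversion looks like (the model name and prompt are just the ones from Ollama's README example):

```js
// Rough fetch() conversion of the curl example from Ollama's README,
// run from a Trilium JS Frontend script. The browser blocks this with a
// CORS error because the page origin isn't http://localhost:11434.
(async () => {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",              // whichever model is pulled locally
      prompt: "Why is the sky blue?",
      stream: false                   // one JSON response instead of NDJSON chunks
    })
  });
  const data = await response.json();
  console.log(data.response);
})();
```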
I tried adding all the CORS-enabling environment variables in a Trilium launcher, but that was just for the sake of having tried it, since they are supposed to help the other way around, right?
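(If I understand the Ollama FAQ correctly, the relevant variable would have to be set on the Ollama side rather than Trilium's, e.g. something like this for the macOS app, followed by restarting Ollama, but I haven't verified that this is the right approach:)

```sh
# Allow browser origins to call the Ollama server; "*" is just for testing,
# a real setup would list the actual origin instead. Set for the macOS app
# via launchctl, then restart Ollama so it picks the variable up.
launchctl setenv OLLAMA_ORIGINS "*"
```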
I also tried to minify ollama.js (which was not supposed to be feasible given the structure of the project, so ChatGPT helped me cheat, though I don't understand what it did) and integrate it as a class (like the renderChart.js example in the Trilium demo), but there it was just a matter of trying my luck. I ran into the same problem I hit every time I don't understand the structure of the inner JS: you're not allowed to use imports outside of a module.
The idea behind it was to benefit from streamed communication, which may not be obvious to achieve with basic fetch() calls.
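(To make that concrete, what I was hoping for is something like the sketch below: reading Ollama's streamed, newline-delimited JSON from /api/chat with a plain fetch() and a stream reader. I assume this is doable without ollama.js, but it's untested and presumably hits the same CORS wall from a frontend script.)

```js
// Minimal sketch of streaming /api/chat with plain fetch():
// Ollama sends one JSON object per line until a chunk with "done": true.
(async () => {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",
      messages: [{ role: "user", content: "Why is the sky blue?" }],
      stream: true
    })
  });

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial line for the next read
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) {
        console.log(chunk.message.content); // append to the UI as it arrives
      }
    }
  }
})();
```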
Anyway, I realize that I may not have tried a 'JS Backend' script yet, but in the meantime, is there a known special trick?
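(The vague idea for the 'JS Backend' route, for what it's worth: since backend scripts run server-side, I assume CORS wouldn't apply, so something like the sketch below might work, assuming fetch is available in the backend runtime.)

```js
// Hypothetical JS Backend script: runs server-side, so the browser's CORS
// policy shouldn't apply. Assumes the backend runtime provides a global fetch.
(async () => {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.2", prompt: "Say hi", stream: false })
  });
  const data = await res.json();
  console.log(data.response); // shows up in the server log
})();
```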