Integration with other LLM models besides OpenAI #72
Replies: 1 comment
Hi @anibalbezerra! Thanks for the appreciation! The setup is straightforward: just configure your endpoint URL and API key. Hope this clarifies the integration! Let me know if you need help with a specific provider setup.
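The pattern described above (a custom endpoint URL plus an API key) is the usual OpenAI-compatible wire format that most alternative providers expose. A minimal sketch of what such a request looks like at the HTTP level, using only the standard library; the base URL, model name, and API key below are placeholders for illustration, not Memori's actual configuration API:

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-compatible /chat/completions request for any provider."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # The API key travels as a Bearer token in the Authorization header.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Hypothetical provider values -- substitute your own endpoint and key.
req = build_chat_request("https://api.example.com/v1", "sk-...", "my-model", "Hello")
print(req.full_url)      # https://api.example.com/v1/chat/completions
print(req.get_method())  # POST
```

Any provider that speaks this format (self-hosted or hosted) can be swapped in by changing only the base URL, key, and model name.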
Dear developers,
First of all, a huge shout-out for such an amazing project! I've read the docs but was unable to discern whether the Memori module works with other LLM providers. I'm not referring to the integration (the watching side) with the LLM in the project running Memori; I'm asking about the memory management itself.
Thanks for your time.
Best
Anibal