Choosing the right hosting approach is critical. Cloud-hosted models offer powerful capabilities but raise privacy concerns, while locally hosted models give you more control but require your own infrastructure.
The current setup, via configurable providers (API types: OpenAI, Anthropic, etc.), lets you set the base API URL and additionally supports LiteLLM, which can proxy requests to pretty much any model hosted locally or in the cloud.
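For illustration, here is a minimal sketch of how such a provider setup can be exercised with the `litellm` Python package. The model names, URLs, and environment variable are placeholders, not anything Odoo-specific; the point is only that the cloud and local cases differ by the base URL:

```python
import os
from litellm import completion

# Cloud provider: a standard OpenAI-style call, key read from the environment.
# (Model name and env var are placeholders for whatever the provider config holds.)
cloud_reply = completion(
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize this sales order in one sentence."}],
    api_key=os.environ["OPENAI_API_KEY"],
)

# Local or self-hosted model: same call shape, only the base URL changes.
# Here it points at an assumed OpenAI-compatible server (e.g. a LiteLLM proxy)
# running on localhost -- the URL and model name are illustrative only.
local_reply = completion(
    model="openai/llama3",
    messages=[{"role": "user", "content": "Summarize this sales order in one sentence."}],
    api_base="http://localhost:4000/v1",
)

print(cloud_reply.choices[0].message.content)
print(local_reply.choices[0].message.content)
```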
Could a hybrid model be the best approach?
How do we efficiently manage compute costs while maintaining high performance? (LiteLLM?)
What are the best ways to optimize AI hosting for Odoo workflows?
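One way to read the hybrid question: send routine traffic to a cheap local model and fall back to a cloud model when the local endpoint errors out or is overloaded. Below is a minimal sketch of that policy using litellm's `Router`; the deployment names, URLs, and models are assumptions for illustration, not a recommended production setup:

```python
from litellm import Router

# Two deployments, each behind a logical name; every URL and model is a placeholder.
router = Router(
    model_list=[
        {
            # Assumed local deployment: an OpenAI-compatible server on localhost.
            "model_name": "local-llm",
            "litellm_params": {
                "model": "openai/llama3",
                "api_base": "http://localhost:4000/v1",
            },
        },
        {
            # Cloud deployment used as a fallback; API key is read from the environment.
            "model_name": "cloud-llm",
            "litellm_params": {
                "model": "openai/gpt-4o-mini",
            },
        },
    ],
    # If the local deployment fails, retry the same request against the cloud one.
    fallbacks=[{"local-llm": ["cloud-llm"]}],
)

response = router.completion(
    model="local-llm",
    messages=[{"role": "user", "content": "Draft a follow-up email for this invoice."}],
)
print(response.choices[0].message.content)
```

The same idea could sit behind a LiteLLM proxy instead of in application code, so the Odoo side only ever talks to one base URL while the routing policy is managed centrally.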