Bedrock and Azure support #371
augustintsang started this conversation in Ideas
Replies: 0 comments
Hi everyone,
I had a few questions and proposals regarding provider support and client configuration in the gollm implementation:
Amazon Bedrock support – Is anyone currently working on an implementation of Bedrock as a provider (similar to openai, ollama, etc.)? If not, I'd be happy to take the lead on contributing this.
Azure OpenAI support – Is the streaming functionality currently being worked on?
Making model configuration less hardcoded – Currently, the model is selected through a mix of env vars and local getOpenAIModel() logic. Would it be acceptable to pass more of these LLM config parameters (api_key, base_url, model) through ClientOptions when instantiating a client, overriding the env-based defaults there?
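To make the third point concrete, here is a minimal sketch of what that precedence could look like. The `ClientOptions` fields and env var names below are assumptions for illustration, not the current gollm API: explicit options win, then env vars, then a built-in default.

```go
package main

import (
	"fmt"
	"os"
)

// ClientOptions is a hypothetical options struct (not the current gollm API):
// any non-empty field overrides the corresponding env-based default.
type ClientOptions struct {
	APIKey  string
	BaseURL string
	Model   string
}

// resolve returns the explicit value if set, else the env var, else a fallback.
func resolve(explicit, envKey, fallback string) string {
	if explicit != "" {
		return explicit
	}
	if v := os.Getenv(envKey); v != "" {
		return v
	}
	return fallback
}

// NewClientConfig merges explicit ClientOptions with env-based defaults.
func NewClientConfig(opts ClientOptions) ClientOptions {
	return ClientOptions{
		APIKey:  resolve(opts.APIKey, "OPENAI_API_KEY", ""),
		BaseURL: resolve(opts.BaseURL, "OPENAI_BASE_URL", "https://api.openai.com/v1"),
		Model:   resolve(opts.Model, "OPENAI_MODEL", "gpt-4o"),
	}
}

func main() {
	os.Setenv("OPENAI_MODEL", "gpt-3.5-turbo")
	// Explicit Model beats the env var; BaseURL is unset, so the default applies.
	cfg := NewClientConfig(ClientOptions{Model: "gpt-4o-mini"})
	fmt.Println(cfg.Model)
	fmt.Println(cfg.BaseURL)
}
```

The same pattern would extend naturally to provider-specific settings (e.g. a Bedrock region), since each new field only needs one `resolve` call.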
Let me know if there's already work in progress here or if you'd like me to open an issue/PR!
Thanks,
Augustin