Commit e6de713
Fix in load_llm.py (#1508)
Fixed an issue where the "proxy" setting was passed to the PublicOpenAPI constructor instead of the "api_base" parameter, which prevented the use of on-premise OpenAI-compatible LLM servers.
Co-authored-by: Alonso Guevara <[email protected]>
Co-authored-by: Josh Bradley <[email protected]>

Parent: c450f85
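The nature of the fix can be illustrated with a minimal sketch. The function and configuration keys below are hypothetical (the actual constructor and field names in load_llm.py may differ); the point is that the buggy version routed the proxy URL into the slot meant for the API base URL, so clients never targeted the on-premise endpoint.

```python
def build_client_kwargs(config: dict) -> dict:
    """Select constructor kwargs for an OpenAI-compatible client.

    Hypothetical illustration of the #1508 fix: the buggy version
    passed config.get("proxy") as "api_base", sending requests to
    the proxy URL instead of the on-premise server's base URL.
    """
    return {
        "api_base": config.get("api_base"),  # fix: use api_base, not proxy
        "proxy": config.get("proxy"),
    }


kwargs = build_client_kwargs(
    {"api_base": "http://llm.internal:8080/v1", "proxy": "http://proxy:3128"}
)
print(kwargs["api_base"])  # the on-premise endpoint, not the proxy
```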