-
Looking through the LiteLLM code to see how the list files endpoint works, it still seems to try to fetch the …
-
Adding the following query parameter to the GET request works for me (v1.60.0): `provider=azure`
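Based on the reply above, here is a minimal sketch of building the list-files URL with that query parameter. The proxy address and path are assumptions (a local LiteLLM proxy on port 4000); the only detail taken from the reply is the `provider=azure` parameter itself.

```python
# Sketch: build the GET /v1/files URL with the provider query parameter.
# Assumption: a LiteLLM proxy running at http://localhost:4000.
from urllib.parse import urlencode

base_url = "http://localhost:4000/v1/files"
# Per the reply above, provider=azure routes the list-files call to Azure
# (reported working on LiteLLM v1.60.0).
query = urlencode({"provider": "azure"})
list_files_url = f"{base_url}?{query}"
print(list_files_url)  # http://localhost:4000/v1/files?provider=azure
```

The same URL can then be fetched with any HTTP client, passing the LiteLLM key as a bearer token.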
-
Hey all!

I'm trying to understand how to set up the `config.yaml` file so that all the file-related endpoints work as expected. What I would like is for users of my LiteLLM setup to be able to use the OpenAI Python client, point `base_url` and `api_key` at LiteLLM, and simply use the methods as-is. Is it possible to do this so that users don't have to include the `extra_body={"custom_llm_provider": "azure"}` parameter?

Secondly: I am able to make file uploads work when I set the config to the following:

With this config, I'm able to run

and I get a proper output:

but I have to provide the extra `-F custom_llm_provider="azure"`, which I'd like not to have to. Also, running

or

both give the same error:

with the same config above. So uploading files works, but fetching files does not. What am I missing? I have tried to find documentation on this but was unable to find any.
Thank you! 🙌
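For context, a config along these lines is the kind of shape LiteLLM's proxy files documentation describes. This is a sketch only, not the asker's actual config (which is not shown in the thread); the Azure endpoint, key variables, and API version below are placeholders:

```yaml
# Sketch based on LiteLLM's proxy files docs; all values are placeholders.
files_settings:
  - custom_llm_provider: azure
    api_base: https://my-resource.openai.azure.com
    api_key: os.environ/AZURE_API_KEY
    api_version: "2024-06-01"
  - custom_llm_provider: openai
    api_key: os.environ/OPENAI_API_KEY
```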