How does text_completion(model) know whether the passed model is a text-completion or a chat-completion model? #930
Replies: 2 comments 1 reply
-
Moving this to a discussion. Is there a specific model you have in mind as you say this? @solyarisoftware
-
So, take for example this basic program:

```python
# !pip install -U litellm
# https://docs.litellm.ai/docs/tutorials/azure_openai
# https://docs.litellm.ai/docs/tutorials/text_completion
from litellm import text_completion

prompt = 'Write a tagline for a traditional italian tavern, as a sentence in Italian language, without quotes'

response = text_completion(
    model="azure/gpt-35-turbo-0613",
    prompt=prompt,
    temperature=0.7,
    max_tokens=100,
    stop=None
)

print(prompt)
print('=>')
print(response['choices'][0]['text'])
```

In the above case, the model is an Azure deployment backed by an OpenAI chat-completion model. But what if I'm passing to text_completion() a deployment or model that is natively a text-completion one? I hope to have been clearer with the above explanation.
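Since gpt-35-turbo-0613 is a chat-mode deployment, a wrapper like text_completion() has to adapt the legacy prompt/text interface to the chat messages interface under the hood. Here is a minimal sketch of that adaptation; the helper names and the stub backend are hypothetical illustrations, not litellm's actual internals:

```python
# Hypothetical sketch of adapting a legacy-style prompt to a chat backend.
# `fake_chat_backend` stands in for a real Chat Completions call; the helper
# names are illustrative, not litellm's actual internals.

def fake_chat_backend(messages):
    # Stub: a real backend would call the provider's Chat Completions API.
    return {"choices": [{"message": {"role": "assistant",
                                     "content": "Benvenuti alla taverna!"}}]}

def text_completion_via_chat(prompt, chat_backend):
    # 1. Wrap the bare prompt into the chat message format
    #    (as observed below, the prompt becomes a system message).
    messages = [{"role": "system", "content": prompt}]
    # 2. Call the chat endpoint.
    chat_response = chat_backend(messages)
    # 3. Reshape the chat response into the legacy Completions shape,
    #    so callers can keep reading response['choices'][0]['text'].
    text = chat_response["choices"][0]["message"]["content"]
    return {"choices": [{"text": text}]}

response = text_completion_via_chat("Write a tagline", fake_chat_backend)
print(response["choices"][0]["text"])
```

The point is that this adaptation only works if the wrapper already knows the deployment is chat-mode, which is exactly the question at hand.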
-
Here to ask for clarification about a possible bug.
In my application I'm successfully using the LiteLLM text_completion() method:
https://github.com/solyarisoftware/prompter.vim/blob/master/python/llm.py#L35
BTW, in my specific use case/environment, I'm using an Azure deployment called "azure/gpt35-turbo" that, as the name suggests, is backed by the OpenAI
chat-completion
model GPT-3.5-Turbo. text_completion() sets the prompt argument as a system prompt. Fine.
My question is:
What if the passed model is backed by a TEXT-MODE model?
For example, referring to the OpenAI LLM provider, TEXT-MODE models (compatible with the legacy Completions endpoint and not Chat Completions) include:
text-davinci-003
gpt-3.5-turbo-instruct
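A router that receives only a model string has to classify it somehow, typically by looking the name up in tables of known models. A toy sketch of such a lookup (the tables below are illustrative assumptions, not litellm's real model lists):

```python
# Hypothetical sketch: dispatch on known model-name tables, as a router
# presumably must, since a provider/deployment name alone carries no
# explicit "mode" field. These sets are illustrative only.
KNOWN_TEXT_COMPLETION_MODELS = {
    "text-davinci-003",
    "gpt-3.5-turbo-instruct",
}
KNOWN_CHAT_COMPLETION_MODELS = {
    "gpt-35-turbo-0613",
    "gpt-3.5-turbo",
    "gpt-4",
}

def guess_mode(model: str) -> str:
    """Strip an optional provider prefix (e.g. 'azure/') and look the
    deployment/model name up in the known tables."""
    name = model.split("/", 1)[-1]
    if name in KNOWN_TEXT_COMPLETION_MODELS:
        return "text-completion"
    if name in KNOWN_CHAT_COMPLETION_MODELS:
        return "chat-completion"
    return "unknown"  # e.g. a custom Azure deployment name

print(guess_mode("azure/gpt-35-turbo-0613"))     # chat-completion
print(guess_mode("gpt-3.5-turbo-instruct"))      # text-completion
print(guess_mode("azure/my-custom-deployment"))  # unknown
```

The "unknown" branch illustrates the core difficulty raised below: an arbitrary Azure deployment name reveals nothing about which endpoint its underlying model expects.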
Reading the text_completion() implementation code (litellm/litellm/main.py, line 1861, at commit f6d40c2),
I do not understand how the function could know whether the model is a
chat-completion
or a text-completion
model, and consequently call the correct legacy Completions endpoint instead of Chat Completions. My expectation/wish is that text_completion() would work for ANY model (including all non-OpenAI
text-completion
models). I fear that text_completion(), lacking an argument that specifies whether the model is chat-completion
or text-completion,
fails for text-completion
models. Isn't it? Please let me know.
giorgio
twitter: @solyarisoftware
linkedin: www.linkedin.com/in/giorgiorobino