-
Hi, thanks for bringing this up. What is the name of the model when you select it from the web UI? As long as you include
[screenshot: "On OpenAI Endpoint" / "On Azure Endpoint" tabs]
-
@danny-avila Thank you for looking into this! I pulled the latest GitHub version and noticed that it's correctly showing in:
[screenshot]
but there is no 'gpt-4-turbo-preview'. I cannot replicate it with the latest pulled version (latest branch from GitHub); however, I do have the debug log from the previous version and the run:
[debug log]
You can see the model in the log above. Is it possible that gpt-4-turbo-preview is not linking to "gpt-4-1106" or "gpt-4-turbo" (both of which look correct in tokens.js)?
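For context, a name-based context-size lookup of the kind tokens.js appears to implement can be sketched as follows. This is purely an illustrative assumption, not LibreChat's actual code: the map keys, values, function name, and fallback are all hypothetical.

```javascript
// Hypothetical token map keyed by model-name prefixes; the real map in
// tokens.js may use different keys and values.
const maxTokensMap = {
  'gpt-4': 8191,
  'gpt-4-1106': 127500,
  'gpt-4-turbo': 127500,
};

// Resolve a model name to a context size: try an exact match first,
// then fall back to the longest map key the name starts with.
function getMaxTokens(modelName) {
  if (maxTokensMap[modelName]) return maxTokensMap[modelName];
  const match = Object.keys(maxTokensMap)
    .filter((key) => modelName.startsWith(key))
    .sort((a, b) => b.length - a.length)[0];
  return match ? maxTokensMap[match] : 4095; // default when nothing matches
}

console.log(getMaxTokens('gpt-4-turbo-preview')); // prints 127500
```

With longest-prefix matching like this, 'gpt-4-turbo-preview' would resolve through 'gpt-4-turbo' rather than the shorter 'gpt-4', so an 8K result would suggest the lookup is matching differently (or the preview name is missing from the map) in the version being run.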
-
On a related note, what is the recommended location for setting the max token value for a custom model using the chat UI? It looks like it's all based on the model name or a default - do I just need to set it in the script? Thanks!
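If the limit really is resolved from the model name, one plausible approach (a sketch only, since the actual file layout and key names may differ from what is shown here) is to add the custom model's exact name to the token map that the lookup consults:

```javascript
// Sketch only: giving a custom model an explicit context size by adding
// its exact name to a token map like the one in tokens.js. The key
// 'my-custom-model' and its value are hypothetical.
const maxTokensMap = {
  'gpt-4': 8191,
  'gpt-3.5-turbo': 4095,
  'my-custom-model': 32768, // hypothetical custom entry
};

// Exact-name lookup with a conservative default for unknown models.
function getMaxTokens(modelName, fallback = 4095) {
  return maxTokensMap[modelName] ?? fallback;
}

console.log(getMaxTokens('my-custom-model')); // prints 32768
```

The trade-off of editing the map directly is that the change must be re-applied after each update, which is why a config- or environment-based override, if the project offers one, would be preferable.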
-
What is your question?
Hi - curious how/where one would set the variable `maxPromptTokens`, for OpenAI for example?
More Details
This came about because we noticed that gpt-4-turbo-preview is set to an 8K input context window, even though the model supports 128K.
Not sure if this is a mistake or intentional, but we want to set it to 128K.
Thank you!
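As a sketch of what such a setting governs (the names and numbers below are illustrative assumptions, not LibreChat internals), a maxPromptTokens value typically caps the prompt side of the context window, with the remainder reserved for the response:

```javascript
// Hedged sketch: how a maxPromptTokens setting could interact with the
// model's context window. All names and numbers are illustrative.
function promptBudget(contextWindow, maxResponseTokens, maxPromptTokens) {
  // The prompt can use at most what the context leaves for it after
  // reserving room for the response.
  const available = contextWindow - maxResponseTokens;
  return maxPromptTokens === undefined
    ? available
    : Math.min(maxPromptTokens, available);
}

// With a 128K window and 4096 reserved for output, leaving
// maxPromptTokens unset hands the full remainder to the prompt.
console.log(promptBudget(128000, 4096, undefined)); // prints 123904
```

Under this reading, an 8K prompt budget on a 128K model would mean either a deliberately low default or a model-name lookup falling back to the wrong context size.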
What is the main subject of your question?
Endpoints
Screenshots
No response