Conversation
Add GPT_3_5_TURBO_0125 and update default GPT_3_5_TURBO context window
Unfortunately I'm not quite sure why the tests failed; I'm completely new to this codebase. The outdated token limits sadly prevent any reasonable use of the latest models, which also come with a significant price cut.
Hi @TheDome0, that test may need to be updated at https://github.com/xebia-functional/xef/blob/main/core/src/commonTest/kotlin/com/xebia/functional/xef/conversation/ConversationSpec.kt#L54
Well, shouldn't the check simply be the same as for the 16k model right below it? The default 3.5 model now also has a 16k context length.
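A minimal sketch of the idea (hypothetical names; the actual constants live in xef's model definitions, not here): per the OpenAI model docs, gpt-3.5-turbo-0125 has a 16,385-token context window, so the default alias and the 16k variant would share the same limit, and the test check for both could be identical.

```kotlin
// Hypothetical sketch, not xef's actual API: model constants with context windows.
// Per https://platform.openai.com/docs/models/gpt-3-5-turbo, gpt-3.5-turbo-0125
// (which gpt-3.5-turbo currently points to) has a 16,385-token context window.
enum class ChatModel(val modelName: String, val maxContextLength: Int) {
    GPT_3_5_TURBO("gpt-3.5-turbo", 16_385),            // now points to -0125
    GPT_3_5_TURBO_0125("gpt-3.5-turbo-0125", 16_385),
    GPT_3_5_TURBO_16K("gpt-3.5-turbo-16k", 16_385),
}

fun main() {
    // The expectation behind the test change: the default model and the
    // 16k variant now have the same context window, so the same check applies.
    check(
        ChatModel.GPT_3_5_TURBO.maxContextLength ==
            ChatModel.GPT_3_5_TURBO_16K.maxContextLength
    )
    println("context windows match")
}
```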
The context window for the new gpt-3.5-turbo model (which currently points to gpt-3.5-turbo-0125) has increased and is now on par with gpt-3.5-turbo-16k.
Add GPT_3_5_TURBO_0125 and update the default GPT_3_5_TURBO context window according to https://platform.openai.com/docs/models/gpt-3-5-turbo