Commit 73e8491
🐛 Fix OpenAI thinking model token parameter handling
Add special handling for OpenAI thinking models to use correct token parameters.
- Add is_openai_thinking_model() helper function to detect thinking models
- Skip setting max_tokens via builder for OpenAI thinking models
- Convert max_tokens to max_completion_tokens in get_combined_config()
- Ensure proper parameter handling for models starting with 'o'
OpenAI thinking models require the max_completion_tokens parameter instead of
max_tokens; sending max_tokens was causing configuration errors.

1 parent 0a202d6
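The diff body was not captured in this page, so the following is only a sketch of the logic the commit message describes, written in Python (the original implementation language is unknown; `is_openai_thinking_model` and `get_combined_config` are named in the message, everything else here is assumed):

```python
def is_openai_thinking_model(model: str) -> bool:
    """Detect OpenAI thinking models, which reject max_tokens and
    require max_completion_tokens instead.

    Per the commit message, detection is simply "models starting
    with 'o'" (e.g. "o1", "o1-mini", "o3").
    """
    return model.startswith("o")


def get_combined_config(model: str, config: dict) -> dict:
    """Return a copy of config with max_tokens renamed to
    max_completion_tokens when targeting a thinking model.

    The caller is assumed to also skip setting max_tokens through
    its request builder for these models, as the commit describes.
    """
    combined = dict(config)
    if is_openai_thinking_model(model) and "max_tokens" in combined:
        combined["max_completion_tokens"] = combined.pop("max_tokens")
    return combined
```

For non-thinking models the config passes through unchanged, so existing callers are unaffected by the conversion.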
1 file changed: +23, −1
lines changed

(diff body not captured in this scrape; three hunks: −1/+6 lines around old line 64, +6 lines at new lines 283–288, +11 lines at new lines 338–348)