Commit 270a90c
Use max_completion_tokens for OpenAI models on OCI GenAI
OpenAI models served through OCI GenAI reject the max_tokens
parameter and require max_completion_tokens instead. Detect
provider=openai in the oci_langchain wrapper and use the correct
key in model_kwargs.

1 parent 1ea626a · commit 270a90c
File tree

- packages/nvidia_nat_langchain
- src/nat/plugins/langchain
- tests

2 files changed: +28 −1 lines changed

Lines changed: 4 additions & 1 deletion
[Hunk in src/nat/plugins/langchain: original line 247 replaced by new lines 247–250; the diff body itself was not preserved.]
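The changed lines were not preserved, so here is a minimal sketch of what the described fix might look like. The function name `build_model_kwargs` and its signature are assumptions for illustration, not the actual wrapper code; only the behavior (switching `max_tokens` to `max_completion_tokens` when `provider` is `openai`) comes from the commit message.

```python
# Hypothetical sketch of the oci_langchain wrapper fix; names are assumptions.

def build_model_kwargs(provider: str, max_tokens: int) -> dict:
    """Choose the token-limit key based on the serving provider.

    OpenAI models served through OCI GenAI reject the ``max_tokens``
    parameter and require ``max_completion_tokens`` instead.
    """
    if provider == "openai":
        return {"max_completion_tokens": max_tokens}
    return {"max_tokens": max_tokens}
```

The key point is that the limit value is unchanged; only the parameter name differs per provider.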
Lines changed: 24 additions & 0 deletions
[Hunk in tests: new lines 217–240 inserted after original line 216; the diff body itself was not preserved.]
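The 24 added test lines are likewise not preserved. A plausible regression test for this behavior might look like the sketch below; the stand-in helper and all test names are assumptions, not the repository's actual code.

```python
# Hypothetical sketch of the regression test added under tests/; names are
# assumptions, and the helper below stands in for the real wrapper logic.

def build_model_kwargs(provider: str, max_tokens: int) -> dict:
    # Stand-in: OpenAI models on OCI GenAI need max_completion_tokens.
    key = "max_completion_tokens" if provider == "openai" else "max_tokens"
    return {key: max_tokens}

def test_openai_uses_max_completion_tokens():
    kwargs = build_model_kwargs("openai", max_tokens=256)
    assert kwargs == {"max_completion_tokens": 256}
    assert "max_tokens" not in kwargs

def test_other_providers_keep_max_tokens():
    kwargs = build_model_kwargs("meta", max_tokens=256)
    assert kwargs == {"max_tokens": 256}
```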