Commit ebd6aec

Author: Roo Code
fix: resolve TypeScript error in BaseOpenAiCompatibleProvider
- Add proper type checking for modelMaxTokens to handle nullish values
- Ensure Math.min only receives number types
1 parent 740c580 · commit ebd6aec
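For context, here is a minimal sketch of the compile error this commit resolves. It assumes modelMaxTokens can be number | undefined (the "nullish values" the message refers to); the actual types come from this.getModel() and this.options and are not shown in this commit.

// Sketch only: stand-ins for the values destructured from this.getModel() and this.options.
declare const userMaxTokens: number | undefined
declare const modelMaxTokens: number | undefined

// The truthiness check narrows userMaxTokens, but modelMaxTokens stays number | undefined,
// so Math.min (which accepts only numbers) is rejected under strict type checking:
// @ts-expect-error -- Argument of type 'number | undefined' is not assignable to parameter of type 'number'.
const before = userMaxTokens ? Math.min(userMaxTokens, modelMaxTokens) : modelMaxTokens

The typeof guards added in the diff below narrow both operands to number before Math.min is called.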

File tree

1 file changed: 4 additions, 3 deletions

src/api/providers/base-openai-compatible-provider.ts

Lines changed: 4 additions & 3 deletions
@@ -73,12 +73,13 @@ export abstract class BaseOpenAiCompatibleProvider<ModelName extends string>
 		} = this.getModel()
 
 		const temperature = this.options.modelTemperature ?? this.defaultTemperature
-
 		// Ensure max_tokens doesn't exceed the model's configured limit
 		// Users can override with modelMaxTokens, but it should not exceed the model's actual API limit
 		const userMaxTokens = this.options.modelMaxTokens
-		const max_tokens = userMaxTokens ? Math.min(userMaxTokens, modelMaxTokens) : modelMaxTokens
-
+		const max_tokens =
+			typeof userMaxTokens === "number" && userMaxTokens > 0 && typeof modelMaxTokens === "number"
+				? Math.min(userMaxTokens, modelMaxTokens)
+				: modelMaxTokens
 		const params: OpenAI.Chat.Completions.ChatCompletionCreateParamsStreaming = {
 			model,
 			max_tokens,
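
As a rough illustration of the resulting behavior, the new clamp can be read as the following standalone helper; the function name and the sample values are illustrative only and not part of the commit.

// Hypothetical helper mirroring the expression in the diff above; values are examples only.
function clampMaxTokens(userMaxTokens: number | undefined, modelMaxTokens: number | undefined) {
	return typeof userMaxTokens === "number" && userMaxTokens > 0 && typeof modelMaxTokens === "number"
		? Math.min(userMaxTokens, modelMaxTokens)
		: modelMaxTokens
}

clampMaxTokens(4096, 8192) // 4096: an override below the model limit is respected
clampMaxTokens(32768, 8192) // 8192: an override above the model limit is capped
clampMaxTokens(0, 8192) // 8192: a non-positive override falls back to the model limit
clampMaxTokens(undefined, 8192) // 8192: no override, the model limit is used
clampMaxTokens(4096, undefined) // undefined: no known model limit, modelMaxTokens is passed through as-is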
