
feat: add temperature and other common settings to model-level configuration #388

Open
robert-j-y wants to merge 1 commit into main from devin/1769809800-issue-387-temperature-settings

Conversation

@robert-j-y
Contributor

Description

Fixes #387

Adds support for setting default values for common model parameters at model creation time. Previously, settings like temperature could only be passed at call time via generateText/streamText. Now users can set defaults when creating the model:

```typescript
// Before: TypeScript error - temperature not allowed
const model = openrouter('google/gemini-3-flash-preview', {
  temperature: 0 // ❌ Error
});

// After: Works correctly
const model = openrouter('google/gemini-3-flash-preview', {
  temperature: 0,
  topP: 0.9,
  maxTokens: 100,
});
```

Call-level options still override model-level settings when provided.
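As a minimal sketch of that override rule (the names `ModelSettings` and `resolveSettings` below are illustrative, not part of the SDK): defined call-level keys win, while model-level defaults fill in the rest.

```typescript
// Illustrative merge of model-level defaults with call-level overrides.
// Not the SDK's actual implementation - just the documented semantics.
interface ModelSettings {
  temperature?: number;
  topP?: number;
  maxTokens?: number;
}

function resolveSettings(
  modelLevel: ModelSettings,
  callLevel: ModelSettings,
): ModelSettings {
  // Spread order: call-level keys that are actually defined override defaults.
  return {
    ...modelLevel,
    ...Object.fromEntries(
      Object.entries(callLevel).filter(([, v]) => v !== undefined),
    ),
  };
}

const merged = resolveSettings(
  { temperature: 0, topP: 0.9 },   // model-level defaults
  { temperature: 0.7 },            // call-level override
);
// merged.temperature === 0.7, merged.topP === 0.9
```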

New settings added to OpenRouterSharedSettings:

  • temperature - Controls randomness (0-2)
  • topP - Nucleus sampling (0-1)
  • topK - Top-k sampling
  • frequencyPenalty - Penalize repeated tokens (-2 to 2)
  • presencePenalty - Penalize tokens based on presence (-2 to 2)
  • maxTokens - Maximum tokens to generate
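The documented bounds can be sketched as simple range checks (a hypothetical helper, not part of the SDK, assuming the ranges listed above):

```typescript
// Illustrative validation of the documented parameter ranges; the
// SamplingSettings/settingsErrors names are assumptions for this sketch.
interface SamplingSettings {
  temperature?: number;      // 0 to 2
  topP?: number;             // 0 to 1
  topK?: number;             // positive integer
  frequencyPenalty?: number; // -2 to 2
  presencePenalty?: number;  // -2 to 2
  maxTokens?: number;        // positive integer
}

function settingsErrors(s: SamplingSettings): string[] {
  const errors: string[] = [];
  // Treat undefined as "not set" and therefore valid.
  const inRange = (v: number | undefined, lo: number, hi: number) =>
    v === undefined || (v >= lo && v <= hi);
  if (!inRange(s.temperature, 0, 2)) errors.push('temperature must be in [0, 2]');
  if (!inRange(s.topP, 0, 1)) errors.push('topP must be in [0, 1]');
  if (!inRange(s.frequencyPenalty, -2, 2)) errors.push('frequencyPenalty must be in [-2, 2]');
  if (!inRange(s.presencePenalty, -2, 2)) errors.push('presencePenalty must be in [-2, 2]');
  if (s.maxTokens !== undefined && (!Number.isInteger(s.maxTokens) || s.maxTokens <= 0))
    errors.push('maxTokens must be a positive integer');
  return errors;
}
```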

Key areas for review:

Checklist

  • I have run pnpm stylecheck and pnpm typecheck
  • I have run pnpm test and all tests pass
  • I have added tests for my changes (if applicable)
  • I have updated documentation (if applicable)

Changeset

  • I have run pnpm changeset to create a changeset file

Link to Devin run: https://app.devin.ai/sessions/ad14476ac9334b8a962f771f56e48975
Requested by: Robert Yeakel (@robert-j-y)

…uration

Fixes #387

This change adds support for setting default values for common model parameters
at the model creation level:
- temperature
- topP
- topK
- frequencyPenalty
- presencePenalty
- maxTokens

These settings can be overridden at call time via generateText/streamText options.

Example:
```typescript
const model = openrouter('google/gemini-3-flash-preview', {
  temperature: 0,
  topP: 0.9,
  maxTokens: 100,
});
```

Co-Authored-By: Robert Yeakel <robert.yeakel@openrouter.ai>
@seannetlife

ah, this is gonna be awesome to have. ty!

Contributor

@louisgv louisgv left a comment


LGTM!



Development

Successfully merging this pull request may close these issues.

Support temperature settings in OpenRouterChatSettings.

3 participants