Commit 886259b

Specify out-of-range handling for params
Closes #53.
1 parent 28733f4 commit 886259b

File tree

1 file changed (+17 -11 lines changed)


README.md

Lines changed: 17 additions & 11 deletions
@@ -175,7 +175,7 @@ We'll likely explore more specific APIs for tool- and function-calling in the fu

 ### Configuration of per-session parameters

-In addition to the `systemPrompt` and `initialPrompts` options shown above, the currently-configurable model parameters are [temperature](https://huggingface.co/blog/how-to-generate#sampling) and [top-K](https://huggingface.co/blog/how-to-generate#top-k-sampling). The `params()` API gives the default, minimum, and maximum values for these parameters.
+In addition to the `systemPrompt` and `initialPrompts` options shown above, the currently-configurable model parameters are [temperature](https://huggingface.co/blog/how-to-generate#sampling) and [top-K](https://huggingface.co/blog/how-to-generate#top-k-sampling). The `params()` API gives the default and maximum values for these parameters.

 _However, see [issue #42](https://github.com/webmachinelearning/prompt-api/issues/42): sampling hyperparameters are not universal among models._

@@ -186,19 +186,22 @@ const customSession = await ai.languageModel.create({
 });

 const params = await ai.languageModel.params();
-const slightlyHighTemperatureSession = await ai.languageModel.create({
-  temperature: Math.max(
-    params.defaultTemperature * 1.2,
-    params.maxTemperature
-  ),
-  topK: 10
+const conditionalSession = await ai.languageModel.create({
+  temperature: isCreativeTask ? params.defaultTemperature * 1.1 : params.defaultTemperature * 0.8,
+  topK: isGeneratingIdeas ? params.maxTopK : params.defaultTopK
 });
-
-// params also contains defaultTopK and maxTopK.
 ```

 If the language model is not available at all in this browser, `params()` will fulfill with `null`.

+Error-handling behavior:
+
+* If values below 0 are passed for `temperature`, then `create()` will return a promise rejected with a `RangeError`.
+* If values above `maxTemperature` are passed for `temperature`, then `create()` will clamp to `maxTemperature`. (`+Infinity` is specifically allowed, as a way of requesting maximum temperature.)
+* If values below 1 are passed for `topK`, then `create()` will return a promise rejected with a `RangeError`.
+* If values above `maxTopK` are passed for `topK`, then `create()` will clamp to `maxTopK`. (This includes `+Infinity` and numbers above `Number.MAX_SAFE_INTEGER`.)
+* If fractional values are passed for `topK`, they are rounded down (using the usual [IntegerPart](https://webidl.spec.whatwg.org/#abstract-opdef-integerpart) algorithm for web specs).
+
 ### Session persistence and cloning

 Each language model session consists of a persistent series of interactions with the model:
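As an illustrative sketch of the out-of-range behavior listed above, assuming the same `ai.languageModel` API (variable names such as `maxedOut` are just for the example):

```js
const params = await ai.languageModel.params();

// Values below the allowed range reject with a RangeError.
try {
  await ai.languageModel.create({ temperature: -0.5 });
} catch (e) {
  console.log(e instanceof RangeError); // true
}

// Values above the maximum are clamped; +Infinity is an explicit way to
// request the maximum temperature or topK.
const maxedOut = await ai.languageModel.create({
  temperature: Infinity,            // behaves like params.maxTemperature
  topK: Number.MAX_SAFE_INTEGER + 2 // clamped to params.maxTopK
});

// Fractional topK values are rounded down (IntegerPart), so 3.9 acts as 3.
const roundedDown = await ai.languageModel.create({ topK: 3.9 });
```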
@@ -472,8 +475,11 @@ interface AILanguageModelParams {
 };

 dictionary AILanguageModelCreateCoreOptions {
-  [EnforceRange] unsigned long topK;
-  float temperature;
+  // Note: these two have custom out-of-range handling behavior, not in the IDL layer.
+  // They are unrestricted double so as to allow +Infinity without failing.
+  unrestricted double topK;
+  unrestricted double temperature;
+
   sequence<DOMString> expectedInputLanguages;
 }

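For illustration, a small sketch of what the IDL change enables, assuming the same API: with the previous `[EnforceRange] unsigned long topK` and `float temperature` declarations, a non-finite value like `+Infinity` would have been rejected at the Web IDL conversion layer with a `TypeError`, whereas `unrestricted double` lets the value reach `create()`, which then applies the clamping described above.

```js
// With unrestricted double, +Infinity survives Web IDL conversion and
// create() clamps it to the maximums reported by params().
const maxSession = await ai.languageModel.create({
  temperature: Infinity, // behaves as maxTemperature
  topK: Infinity         // behaves as maxTopK
});

// Under the old IDL, the same call would have thrown a TypeError during
// argument conversion, before any clamping logic could run.
```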
