
Commit 47fccde

adding links to default inference configuration

1 parent ca8086e commit 47fccde

File tree

1 file changed: 11 additions & 6 deletions

src/pages/[platform]/ai/concepts/inference-configuration/index.mdx
@@ -64,7 +64,9 @@ a.generation({
 
 ### Temperature
 
-Affects the shape of the probability distribution for the predicted output and influences the likelihood of the model selecting lower-probability outputs. Temperature is a number from 0 to 1, where a lower value will influence the model to select higher-probability options. Another way to think about temperature is to think about creativity. A low number (close to zero) would produce the least creative and most deterministic response.
+Affects the shape of the probability distribution for the predicted output and influences the likelihood of the model selecting lower-probability outputs. Temperature is usually* a number from 0 to 1, where a lower value will influence the model to select higher-probability options. Another way to think about temperature is to think about creativity. A low number (close to zero) would produce the least creative and most deterministic response.
+
+-* AI21 Labs Jamba models use a temperature range of 0 – 2.0
 
 ### Top P
 
@@ -81,10 +83,13 @@ This parameter is used to limit the maximum response a model can give.
 
 | Model | Temperature | Top P | Max Tokens |
 | ----- | ----------- | ----- | ---------- |
-| Meta Llama | 0.5 | 0.9 | 512 |
-| Amazon Titan | 0.7 | 0.9 | 512 |
-| Anthropic Claude | 1 | 0.999 | 512 |
-| Cohere Command R | 0.3 | 0.75 | 512 |
-| Mistral Large | 0.7 | 1 | 8192 |
+| [AI21 Labs Jamba](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-jamba.html#model-parameters-jamba-request-response) | 1.0* | 0.5 | 4096 |
+| [Meta Llama](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-meta.html#model-parameters-meta-request-response) | 0.5 | 0.9 | 512 |
+| [Amazon Titan](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-titan-text.html) | 0.7 | 0.9 | 512 |
+| [Anthropic Claude](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-anthropic-claude-messages.html#model-parameters-anthropic-claude-messages-request-response) | 1 | 0.999 | 512 |
+| [Cohere Command R](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-cohere-command-r-plus.html#model-parameters-cohere-command-request-response) | 0.3 | 0.75 | 512 |
+| [Mistral Large](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-mistral-chat-completion.html#model-parameters-mistral-chat-completion-request-response) | 0.7 | 1 | 8192 |
 
 [Bedrock documentation on model default inference configuration](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters.html)
+
+-* AI21 Labs Jamba models use a temperature range of 0 – 2.0
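The temperature paragraph in the first hunk is easier to see with numbers. Below is a minimal sketch of a temperature-scaled softmax, the mechanism that paragraph describes; the function name and logits are illustrative and not part of the commit:

```ts
// Temperature-scaled softmax: divide logits by the temperature before
// normalizing. A low temperature sharpens the distribution (more
// deterministic); a temperature near 1 leaves it flatter (more "creative").
function softmaxWithTemperature(logits: number[], temperature: number): number[] {
  const scaled = logits.map((l) => l / temperature); // temperature must be > 0
  const max = Math.max(...scaled); // subtract the max for numerical stability
  const exps = scaled.map((s) => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

const logits = [2.0, 1.0, 0.5]; // made-up scores for three candidate tokens
console.log(softmaxWithTemperature(logits, 0.2)); // ~[0.99, 0.01, 0.00]: top option dominates
console.log(softmaxWithTemperature(logits, 1.0)); // ~[0.63, 0.23, 0.14]: lower-probability options stay in play
```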
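For context on the `a.generation({` fragment in the first hunk header: the defaults in the table apply when a route does not set its own values, and they can be overridden per route in the Amplify AI kit schema. A hedged sketch, assuming the documented `inferenceConfiguration` shape; the route name, model, prompt, and return type here are illustrative:

```ts
import { a } from "@aws-amplify/backend";

// Hypothetical generation route; inferenceConfiguration is the point here.
// Its keys mirror the table above: temperature, topP, maxTokens.
const generateRecipe = a.generation({
  aiModel: a.ai.model("Claude 3 Haiku"),
  systemPrompt: "You are a helpful assistant that generates recipes.",
  inferenceConfiguration: {
    temperature: 0.5, // 0 to 1 for most models; AI21 Labs Jamba accepts 0 to 2.0
    topP: 0.9,
    maxTokens: 512,
  },
})
  .arguments({ description: a.string() })
  .returns(a.customType({ name: a.string(), ingredients: a.string().array() }))
  .authorization((allow) => allow.authenticated());
```

Omitting `inferenceConfiguration` entirely falls back to the per-model Bedrock defaults listed in the table.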
