@@ -50,7 +50,7 @@ All generative AI routes in Amplify accept inference configuration as optional p

```ts
a.generation({
aiModel: a.ai.model("Claude 3 Haiku"),
aiModel: a.ai.model("Claude 3.5 Haiku"),
systemPrompt: `You are a helpful assistant`,
inferenceConfiguration: {
temperature: 0.2,
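The hunk above truncates the code block after `temperature`; below is a minimal sketch of the full route shape, assuming the remaining Converse inference parameters (`topP`, `maxTokens`) and placeholder values.

```ts
// Sketch of a complete generation route with inference configuration.
// topP and maxTokens mirror Bedrock's Converse inference parameters;
// the values are placeholders, not recommendations.
import { a } from '@aws-amplify/backend';

const schema = a.schema({
  summarizer: a.generation({
    aiModel: a.ai.model("Claude 3.5 Haiku"),
    systemPrompt: `You are a helpful assistant`,
    inferenceConfiguration: {
      temperature: 0.2, // lower values give more deterministic output
      topP: 0.9,        // nucleus-sampling cutoff
      maxTokens: 1000,  // upper bound on generated tokens
    },
  })
    .arguments({ input: a.string() })
    .returns(a.customType({ summary: a.string() }))
    .authorization((allow) => allow.authenticated()),
});
```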
19 changes: 12 additions & 7 deletions src/pages/[platform]/ai/concepts/models/index.mdx
@@ -58,27 +58,32 @@ Always refer to [Bedrock pricing](https://aws.amazon.com/bedrock/pricing/) for t
The Amplify AI Kit uses Bedrock's [Converse API](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference.html) to provide a unified API across models. Most models differ in how they best accept input and how they format their output. For example, ...

### AI21 Labs
* [Jamba 1.5 Large](https://aws.amazon.com/blogs/aws/jamba-1-5-family-of-models-by-ai21-labs-is-now-available-in-amazon-bedrock/)
* [Jamba 1.5 Mini](https://aws.amazon.com/blogs/aws/jamba-1-5-family-of-models-by-ai21-labs-is-now-available-in-amazon-bedrock/)

* Jamba 1.5 Large
* Jamba 1.5 Mini
[Bedrock documentation about AI21 models](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-ai21.html)

### Anthropic
* Claude 3 Haiku
* Claude 3.5 Haiku
* Claude 3 Sonnet
* Claude 3 Opus
* Claude 3.5 Sonnet
https://docs.anthropic.com/en/docs/about-claude/models
* Claude 3.5 Sonnet v2
[Bedrock documentation about Anthropic models](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-claude.html)

### Cohere
* Command R
* Command R+
[Bedrock documentation about Cohere models](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-cohere.html)

### Meta
### Meta Llama
* Llama 3.1
[Bedrock documentation about Meta Llama models](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-meta.html)

### Mistral
### Mistral AI
* Large
* Large 2
[Bedrock documentation about Mistral AI models](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-mistral.html)


The Amplify AI Kit makes use of ["tools"](/[platform]/ai/concepts/tools) for both generation and conversation routes. [Models used with the AI Kit must support tool use in the Converse API](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html).
@@ -115,7 +120,7 @@ Using the Amplify AI Kit you can easily use different models for different funct
```ts
const schema = a.schema({
summarizer: a.generation({
aiModel: a.ai.model("Claude 3 Haiku")
aiModel: a.ai.model("Claude 3.5 Haiku")
})
})
```
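Because every supported model is driven through the same Converse API, giving different routes different models only means changing the `aiModel` value. A sketch under that assumption (the route and field names here are illustrative, not taken from the diff):

```ts
import { a } from '@aws-amplify/backend';

const schema = a.schema({
  // A small, inexpensive model for short summaries.
  summarizer: a.generation({
    aiModel: a.ai.model("Claude 3.5 Haiku"),
    systemPrompt: "Summarize the provided text in two sentences.",
  })
    .arguments({ input: a.string() })
    .returns(a.customType({ summary: a.string() }))
    .authorization((allow) => allow.authenticated()),

  // A larger model for longer-form output; the route shape is identical.
  storyWriter: a.generation({
    aiModel: a.ai.model("Claude 3.5 Sonnet"),
    systemPrompt: "Write a short story based on the provided premise.",
  })
    .arguments({ premise: a.string() })
    .returns(a.customType({ story: a.string() }))
    .authorization((allow) => allow.authenticated()),
});
```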
2 changes: 1 addition & 1 deletion src/pages/[platform]/ai/concepts/prompting/index.mdx
@@ -52,7 +52,7 @@ All AI routes in the Amplify AI kit require a system prompt. This will be used i

```ts
reviewSummarizer: a.generation({
aiModel: a.ai.model("Claude 3.5 Sonnet"),
aiModel: a.ai.model("Claude 3.5 Haiku"),
systemPrompt: `
You are a helpful assistant that summarizes reviews
for an ecommerce site.
@@ -62,7 +62,7 @@ const schema = a.schema({
// highlight-end

chat: a.conversation({
aiModel: a.ai.model("Claude 3.5 Sonnet"),
aiModel: a.ai.model("Claude 3.5 Haiku"),
systemPrompt: `You are a helpful assistant.`,
// highlight-start
tools: [
8 changes: 4 additions & 4 deletions src/pages/[platform]/ai/conversation/tools/index.mdx
@@ -72,7 +72,7 @@ const schema = a.schema({
.authorization(allow => allow.owner()),

chat: a.conversation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'Hello, world!',
tools: [
a.ai.dataTool({
@@ -126,7 +126,7 @@ const schema = a.schema({
.authorization((allow) => allow.authenticated()),

chat: a.conversation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant',
tools: [
a.ai.dataTool({
@@ -235,13 +235,13 @@ export const chatHandler = defineConversationHandlerFunction({
entry: './chatHandler.ts',
name: 'customChatHandler',
models: [
{ modelId: a.ai.model("Claude 3 Haiku") }
{ modelId: a.ai.model("Claude 3.5 Haiku") }
]
});

const schema = a.schema({
chat: a.conversation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: "You are a helpful assistant",
handler: chatHandler,
})
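The hunks above are cut off inside `a.ai.dataTool({`; below is a minimal sketch of a complete data tool definition. The `Post` model, tool name, and description are hypothetical; only the overall shape (`name`, `description`, `model`, `modelOperation`) is assumed.

```ts
import { a } from '@aws-amplify/backend';

const schema = a.schema({
  // Hypothetical data model the tool will query.
  Post: a.model({
    title: a.string(),
    body: a.string(),
  }).authorization((allow) => allow.owner()),

  chat: a.conversation({
    aiModel: a.ai.model('Claude 3.5 Haiku'),
    systemPrompt: 'You are a helpful assistant',
    tools: [
      a.ai.dataTool({
        // The name and description tell the model when to call the tool.
        name: 'searchPosts',
        description: 'Searches the posts belonging to the current user',
        // Point the tool at a data model and the query operation to run.
        model: a.ref('Post'),
        modelOperation: 'list',
      }),
    ],
  }).authorization((allow) => allow.owner()),
});
```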
@@ -42,7 +42,7 @@ const schema = a.schema({
}),

extractProductDetails: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'Extract the property details from the text provided',
})
.arguments({
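This hunk is likewise truncated at `.arguments`; a sketch of how such an extraction route can return structured output through a typed custom type (the argument and return field names are assumptions):

```ts
import { a } from '@aws-amplify/backend';

const schema = a.schema({
  extractProductDetails: a.generation({
    aiModel: a.ai.model('Claude 3.5 Haiku'),
    systemPrompt: 'Extract the property details from the text provided',
  })
    .arguments({ text: a.string() })
    // The model's response is returned as this typed shape.
    .returns(
      a.customType({
        name: a.string(),
        summary: a.string(),
        price: a.float(),
      })
    )
    .authorization((allow) => allow.authenticated()),
});
```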
12 changes: 6 additions & 6 deletions src/pages/[platform]/ai/generation/index.mdx
@@ -41,7 +41,7 @@ Under the hood, a generation route is an AWS AppSync query that ensures the AI m
```ts title="Schema Definition"
const schema = a.schema({
generateRecipe: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant that generates recipes.',
})
.arguments({ description: a.string() })
@@ -117,7 +117,7 @@ export default function Example() {
```ts title="Schema Definition"
const schema = a.schema({
summarize: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'Provide an accurate, clear, and concise summary of the input provided'
})
.arguments({ input: a.string() })
@@ -141,7 +141,7 @@ This ability to control the randomness and diversity of responses is useful for
```ts title="Inference Parameters"
const schema = a.schema({
generateHaiku: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant that generates haikus.',
// highlight-start
inferenceConfiguration: {
@@ -168,7 +168,7 @@ const schema = a.schema({
instructions: a.string(),
}),
generateRecipe: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant that generates recipes.',
})
.arguments({ description: a.string() })
@@ -187,7 +187,7 @@ const schema = a.schema({
instructions: a.string(),
}),
generateRecipe: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant that generates recipes.',
})
.arguments({ description: a.string() })
@@ -211,7 +211,7 @@ The following AppSync scalar types are not supported as **required** fields in r
```ts title="Unsupported Required Type"
const schema = a.schema({
generateUser: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant that generates users.',
})
.arguments({ description: a.string() })
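For context, a sketch of how a generation route such as `generateRecipe` is called from the client, assuming the generated Data client exposes generation routes under `client.generations` and that the backend exports a `Schema` type as usual in Amplify Gen 2 projects:

```ts
import { generateClient } from 'aws-amplify/data';
import type { Schema } from '../amplify/data/resource';

const client = generateClient<Schema>();

export async function generateRecipeExample() {
  // Each a.generation() route becomes a strongly typed query on the client.
  const { data, errors } = await client.generations.generateRecipe({
    description: 'A quick vegetarian pasta dish',
  });

  if (errors) {
    console.error(errors);
    return;
  }
  console.log(data?.instructions);
}
```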
4 changes: 2 additions & 2 deletions src/pages/[platform]/ai/set-up-ai/index.mdx
@@ -86,7 +86,7 @@ const schema = a.schema({
// This will add a new conversation route to your Amplify Data backend.
// highlight-start
chat: a.conversation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant',
})
.authorization((allow) => allow.owner()),
@@ -95,7 +95,7 @@ const schema = a.schema({
// This adds a new generation route to your Amplify Data backend.
// highlight-start
generateRecipe: a.generation({
aiModel: a.ai.model('Claude 3 Haiku'),
aiModel: a.ai.model('Claude 3.5 Haiku'),
systemPrompt: 'You are a helpful assistant that generates recipes.',
})
.arguments({
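Finally, a sketch of the client-side wiring for the `chat` conversation route above. It assumes the React helpers that ship with the AI Kit (`createAIHooks` and `AIConversation` from `@aws-amplify/ui-react-ai`); treat the exact hook return shape as an assumption.

```tsx
import { Amplify } from 'aws-amplify';
import { generateClient } from 'aws-amplify/data';
import { createAIHooks, AIConversation } from '@aws-amplify/ui-react-ai';
import type { Schema } from '../amplify/data/resource';
import outputs from '../amplify_outputs.json';

Amplify.configure(outputs);

const client = generateClient<Schema>({ authMode: 'userPool' });
// One hook is generated per conversation route defined in the schema.
const { useAIConversation } = createAIHooks(client);

export default function Chat() {
  const [
    {
      data: { messages },
      isLoading,
    },
    handleSendMessage,
  ] = useAIConversation('chat'); // 'chat' matches the route name in the schema

  return (
    <AIConversation
      messages={messages}
      isLoading={isLoading}
      handleSendMessage={handleSendMessage}
    />
  );
}
```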