
Amplify Gen2 Llama integration error #3058

@bogris

Description

Environment information

```
appy-contest git:(main) ✗ npx ampx info

System:
  OS: macOS 15.0
  CPU: (8) arm64 Apple M1 Pro
  Memory: 134.86 MB / 16.00 GB
  Shell: /bin/zsh
Binaries:
  Node: 20.10.0 - ~/.nvm/versions/node/v20.10.0/bin/node
  Yarn: 1.22.15 - ~/.npm-packages/bin/yarn
  npm: 10.9.0 - ~/.nvm/versions/node/v20.10.0/bin/npm
  pnpm: 6.11.0 - ~/.npm-packages/bin/pnpm
NPM Packages:
  @aws-amplify/auth-construct: 1.5.0
  @aws-amplify/backend: 1.8.0
  @aws-amplify/backend-auth: 1.4.1
  @aws-amplify/backend-cli: 1.4.2
  @aws-amplify/backend-data: 1.2.1
  @aws-amplify/backend-deployer: 1.1.9
  @aws-amplify/backend-function: 1.8.0
  @aws-amplify/backend-output-schemas: 1.4.0
  @aws-amplify/backend-output-storage: 1.1.3
  @aws-amplify/backend-secret: 1.1.4
  @aws-amplify/backend-storage: 1.2.3
  @aws-amplify/cli-core: 1.2.0
  @aws-amplify/client-config: 1.5.2
  @aws-amplify/deployed-backend-client: 1.4.2
  @aws-amplify/form-generator: 1.0.3
  @aws-amplify/model-generator: 1.0.9
  @aws-amplify/platform-core: 1.2.1
  @aws-amplify/plugin-types: 1.5.0
  @aws-amplify/sandbox: 1.2.6
  @aws-amplify/schema-generator: 1.2.5
  aws-amplify: 6.9.0
  aws-cdk: 2.163.1
  aws-cdk-lib: 2.163.1
  typescript: 5.6.3
AWS environment variables:
  AWS_STS_REGIONAL_ENDPOINTS = regional
  AWS_NODEJS_CONNECTION_REUSE_ENABLED = 1
  AWS_SDK_LOAD_CONFIG = 1
No CDK environment variables
npm notice
npm notice New patch version of npm available! 10.9.0 -> 10.9.1
npm notice Changelog: https://github.com/npm/cli/releases/tag/v10.9.1
npm notice To update run: npm install -g [email protected]
npm notice
```

Describe the bug

With this config for a generation in Amplify Data:

```ts
const aiCheckSong = a
  .generation({
    aiModel: a.ai.model("Llama 3.1 405B Instruct"),
    systemPrompt: `You are a helpful assistant that checks, corrects and returns full song names, full composers name  and types. 
      Based on the input, you should return the corrected song name, composer and type. 
      The song name and type should be included in response only if the input contains a song name, 
      Feel free to add the composer if it's not included in the input and you are confident you know it.`,
  })
  .arguments({
    inputName: a.string(),
    inputComposer: a.string(),
  })
  .returns(
    a.customType({
      songName: a.string(),
      songComposer: a.string(),
    })
  )
  .authorization((allow) => allow.publicApiKey());
```

Calling api.generations.aiCheckSong from the frontend throws this error:

```json
[
  {
    "path": [
      "aiCheckSong"
    ],
    "data": null,
    "errorType": "ValidationException:http://internal.amazon.com/coral/com.amazon.bedrock/",
    "errorInfo": null,
    "locations": [
      {
        "line": 2,
        "column": 3,
        "sourceName": null
      }
    ],
    "message": "A custom error was thrown from a mapping template."
  }
]
```

The same setup works with Anthropic Claude 3.5 as the model, and I get the correctly formatted object back.

Looking at the docs, I don't see any such limitation listed for the Llama model.
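
For reference, the working variant differs only in the model line. A minimal sketch, assuming the "Claude 3.5 Sonnet" model identifier (everything else unchanged from the definition above):

```ts
// Working variant: identical definition, only the model line swapped.
// Assumption: "Claude 3.5 Sonnet" is the exact model identifier used;
// the prompt placeholder stands in for the full prompt shown above.
const aiCheckSong = a
  .generation({
    aiModel: a.ai.model("Claude 3.5 Sonnet"),
    systemPrompt: `...same prompt as above...`,
  })
  .arguments({
    inputName: a.string(),
    inputComposer: a.string(),
  })
  .returns(
    a.customType({
      songName: a.string(),
      songComposer: a.string(),
    })
  )
  .authorization((allow) => allow.publicApiKey());
```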

Reproduction steps

Deploy a Data API with the setup from above.

From the frontend, call the API with:

```ts
const res = await clientPublicApi.generations.aiCheckSong({
  inputName: name,
  inputComposer: composer,
});
```

Both params are strings.
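
For completeness, a sketch of how the failure surfaces on the client, assuming the standard { data, errors } response shape of the Gen 2 data client (the literal inputs below are placeholders):

```ts
// Sketch only: assumes the standard { data, errors } response shape
// of the Gen 2 data client; inputs below are placeholder values.
const { data, errors } = await clientPublicApi.generations.aiCheckSong({
  inputName: "Clair de lune", // placeholder
  inputComposer: "Debussy",   // placeholder
});

if (errors) {
  // With "Llama 3.1 405B Instruct" this holds the ValidationException
  // shown above; with Claude 3.5 it is undefined.
  console.error(errors);
} else {
  console.log(data?.songName, data?.songComposer);
}
```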
