
[Bug]: Getting "ValidationException: The provided model identifier is invalid." for Bedrock provider #515

@nisranjan

Description


@augustintsang

Environment (please complete the following):

  • OS: GNU/Linux 5.15.164.4-microsoft-standard-WSL2 (Ubuntu on WSL2)
  • kubectl-ai version (run kubectl-ai version): 0.3.0
  • LLM provider: Amazon Bedrock
  • LLM model: Amazon Titan Text Express V1

Describe the bug
I am trying to resolve some issues with the Bedrock provider and am hitting an error. The specific error is:

E0903 10:33:59.626219 1 conversation.go:472] "error sending streaming LLM response" err="bedrock stream error: operation error Bedrock Runtime: ConverseStream, https response error StatusCode: 400, RequestID: a9464e5c-4679-4f86-bf6d-46cf872c8bd3, ValidationException: The provided model identifier is invalid."

I am using the model Amazon Titan Text Express V1 (since Claude was not available to me), and I have turned off the system prompt in code as per this document, but I still get the same error.

I have also tried other models, e.g. Amazon Nova Pro, but there I get the same error as well. Interestingly, when I invoke that model from the AWS CLI, I get a different error:

C:\Users\DELL\OneDrive\Documents\kubectl-ai>aws bedrock-runtime converse --region ap-south-1 --model-id amazon.nova-pro-v1:0 --messages file://messages.json --profile nranjan

An error occurred (ValidationException) when calling the Converse operation: Invocation of model ID amazon.nova-pro-v1:0 with on-demand throughput isn't supported. Retry your request with the ID or ARN of an inference profile that contains this model.
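For what it's worth, the "on-demand throughput isn't supported" error usually means the request must name a cross-region inference profile rather than the bare model ID. Those profile IDs are typically the model ID with a geography prefix ("us.", "eu.", "apac."); the exact IDs available in an account should be checked with `aws bedrock list-inference-profiles`. A minimal sketch of the mapping (not code from kubectl-ai, just an illustration of the ID shape):

```python
def inference_profile_id(region: str, model_id: str) -> str:
    """Guess the cross-region inference profile ID for a model in a region.

    Assumption: Bedrock cross-region profiles use a geography prefix
    ("us.", "eu.", "apac.") in front of the foundation-model ID. Verify
    against `aws bedrock list-inference-profiles` for your account.
    """
    if region.startswith("us-"):
        prefix = "us"
    elif region.startswith("eu-"):
        prefix = "eu"
    elif region.startswith("ap-"):
        prefix = "apac"
    else:
        raise ValueError(f"no known inference profile geography for {region}")
    return f"{prefix}.{model_id}"

# For the ap-south-1 case above this yields "apac.amazon.nova-pro-v1:0",
# which is the kind of ID the ValidationException is asking for.
print(inference_profile_id("ap-south-1", "amazon.nova-pro-v1:0"))
```

If that profile ID works from the AWS CLI, the same value would need to be passed as the model identifier to kubectl-ai.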

I could not use the Anthropic Claude Sonnet models, because I made an error while requesting access to them through the Amazon Marketplace, and I have not been able to regain that approval; at least I don't have it now.

To Reproduce
Steps to reproduce the behavior:

  1. Set up the following environment variables:
    AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_REGION

  2. Go to the kubectl-ai project directory and build an image using
    docker build -f images/kubectl-ai/Dockerfile -t kubectl-ai:latest .

  3. Run the container:
    docker run --rm -it -v ~/.kube:/root/.kube -v ~/home/ubuntu/.aws:/root/.aws -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_REGION=us-east-1 -e BEDROCK_MODEL="amazon.titan-text-express-v1" kubectl-ai:v0.1 --llm-provider=bedrock "test"

  4. See the error:
    E0903 10:33:37.477675 1 conversation.go:472] "error sending streaming LLM response" err="bedrock stream error: operation error Bedrock Runtime: ConverseStream, https response error StatusCode: 400, RequestID: ce65503d-1fdd-4d5b-b821-168c4f51207e, ValidationException: The provided model identifier is invalid."
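The run command in step 3 can be parameterized as a small script. This is only a sketch: it assembles and prints the command rather than executing it, and the defaults (image tag, model ID, region) are taken from the report above.

```shell
#!/bin/sh
# Sketch: assemble the docker run command from step 3 without executing it.
# Defaults mirror the values used in this report; override via environment.
MODEL="${BEDROCK_MODEL:-amazon.titan-text-express-v1}"
REGION="${AWS_REGION:-us-east-1}"
IMAGE="${IMAGE:-kubectl-ai:v0.1}"

CMD="docker run --rm -it -v ~/.kube:/root/.kube -v ~/.aws:/root/.aws \
-e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_REGION=$REGION \
-e BEDROCK_MODEL=$MODEL $IMAGE --llm-provider=bedrock"

# Print the assembled command for inspection before running it by hand.
echo "$CMD"
```

Printing the command first makes it easy to check which model ID actually reaches the container before troubleshooting the ValidationException further.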


Metadata

Labels: bug (Something isn't working)
