I imported qwen2.5 to Bedrock, but when I try to use it for autocomplete or chat I get: This action doesn't support the model that you provided. Try again with a supported text or chat model.
#9192 · Unanswered
robinclark asked this question in Help
Hello,
I imported qwen2.5 to Bedrock, but when I try to use it for autocomplete or chat I get:

> This action doesn't support the model that you provided. Try again with a supported text or chat model.

I'm able to query this model via the AWS SDK. Here's my config.yaml entry:

According to this issue: #6166, using provider `bedrock` is correct, as opposed to `bedrockimport`.

Side note: I also tried the `bedrockimport` provider, and when I did I got this error:

```
[@continuedev] error: No value provided for input HTTP label: modelId. {"context":"llm_stream_complete","model":"arn:aws:bedrock:us-east-1:[aws-acct]:imported-model/d5ver0avdfiz","provider":"bedrockimport","useOpenAIAdapter":false,"streamEnabled":true}
```

The config for that:
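For context, this is roughly the shape of entry I'm working from. It's a sketch, not my exact config: the field names (`name`, `provider`, `model`, `env.region`, `roles`) are my reading of Continue's config.yaml schema, the model ARN is the one from the error log above, and the account ID stays redacted:

```yaml
# Sketch of a Continue config.yaml model entry for a Bedrock imported model.
# Field names assume Continue's YAML schema; ARN taken from the error log,
# account ID redacted as [aws-acct].
models:
  - name: qwen2.5 (Bedrock import)
    provider: bedrock
    model: arn:aws:bedrock:us-east-1:[aws-acct]:imported-model/d5ver0avdfiz
    env:
      region: us-east-1
    roles:
      - chat
      - autocomplete
```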
Any ideas?