Unable to configure conversation parameters, error on token limit #4716
Unanswered
leoneperdigao
asked this question in Troubleshooting
Replies: 1 comment 5 replies
Hi @leoneperdigao. I took a quick look at the bedrock.js file in the langchain node module, and it seems that max_tokens isn't being set, possibly because the provider you're using isn't on the list of supported providers (anthropic, ai21, meta, amazon, cohere, mistral). That may explain the "undefined".
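To illustrate the behavior described above, here is a minimal sketch of how a Bedrock client might build a provider-specific request body. The function and field names are illustrative assumptions, not LangChain's actual internals; the point is that a model ID whose prefix isn't on the supported list falls through without any token-limit field:

```javascript
// Hypothetical simplification: Bedrock expects a different request-body
// shape per provider, keyed off the model ID prefix (e.g. "anthropic."
// in "anthropic.claude-v2"). Names here are illustrative only.
function buildRequestBody(modelId, prompt, maxTokens) {
  const provider = modelId.split(".")[0];
  switch (provider) {
    case "anthropic":
      return { prompt, max_tokens_to_sample: maxTokens };
    case "ai21":
      return { prompt, maxTokens };
    case "amazon":
      return {
        inputText: prompt,
        textGenerationConfig: { maxTokenCount: maxTokens },
      };
    // meta, cohere, mistral omitted for brevity
    default:
      // An unrecognized prefix (e.g. a custom inference profile ID)
      // falls through with no token-limit field at all, which can
      // later surface as "undefined".
      return { prompt };
  }
}
```

Under this assumption, a custom inference profile whose ID doesn't start with a known provider prefix would silently lose the max-tokens setting, matching the symptom in the question.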
What happened?
I'm facing an error when I try to interact with a model via AWS Bedrock. The model is made available through a custom inference profile, due to region restrictions. However, I get this strange error:
Also, I can't set the parameters in the right-side panel:
I checked the documentation and didn't see anything specific about this. I wonder whether it is a misconfiguration on my side.
Steps to Reproduce
Just start a new conversation and enter any prompt.
What browsers are you seeing the problem on?
Chrome
Relevant log output
No response
Screenshots
No response