Configuring Anthropic models from Bedrock using custom endpoint #9175
keepthegoal
started this conversation in Help Wanted
Hi everyone,
I'm trying to send requests to Anthropic models on Bedrock through an enterprise proxy with a custom endpoint.
I'm working with the librechat:0.7.9 image.
Unfortunately, I can't use the pre-configured Bedrock provider as described here:
https://www.librechat.ai/docs/configuration/pre_configured_ai/bedrock
So the only option is a custom endpoint.
Here's the custom endpoint I'm trying to set up:
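(A minimal sketch of a custom endpoint of this shape; the baseURL is inferred from the request URL shown further down, while the endpoint name, model id, and API-key variable are illustrative placeholders:)

```yaml
# librechat.yaml (sketch; adjust values to your deployment)
version: 1.2.1                                # config schema version; match your release
endpoints:
  custom:
    - name: "Bedrock Gateway"                 # illustrative display name
      apiKey: "${BEDROCK_GATEWAY_API_KEY}"    # illustrative env var
      baseURL: "https://bedrock-gateway-beta.apps.ai-dev.abc.cloud/invoke"
      models:
        default:
          - "anthropic.claude-3-5-sonnet-20240620-v1:0"   # illustrative model id
        fetch: false
      titleConvo: true
      titleModel: "current_model"
```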
With this configuration, I see the following URL in the request:
https://bedrock-gateway-beta.apps.ai-dev.abc.cloud/invoke/chat/completions
and I get a 404 error that references this troubleshooting page:
https://js.langchain.com/docs/troubleshooting/errors/MODEL_NOT_FOUND/
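(One way to narrow this down is to check whether the gateway itself serves an OpenAI-compatible chat completions route at that path; the curl below is a sketch, and the model id and API-key variable are illustrative:)

```bash
# Sketch: probe the gateway directly; model id and auth header are illustrative.
curl -sS -X POST \
  "https://bedrock-gateway-beta.apps.ai-dev.abc.cloud/invoke/chat/completions" \
  -H "Authorization: Bearer $BEDROCK_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
       "messages": [{"role": "user", "content": "ping"}]}'
```

If this also returns 404, the gateway probably doesn't expose a /chat/completions route at that path and the baseURL may need a different suffix; if it succeeds, the problem is more likely in the LibreChat endpoint config itself.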