Feature request - custom LLM inference endpoint or AWS Bedrock endpoint #3

Description

@sfc-gh-pmanowiecki

Would love to see an option to configure a custom inference endpoint (not just the OpenAI/Anthropic/Google/Moonshot APIs), or to add AWS Bedrock as an additional option.
For enterprise customers this is usually a must-have; I know the current focus may be startups.
Alternatively, is it possible to "manually" configure this by editing the config files?
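For context on what a custom-endpoint option could look like: many self-hosted and enterprise gateways expose an OpenAI-compatible `/chat/completions` API, so letting users override the base URL is often sufficient. Below is a minimal, standard-library-only sketch of building such a request against a configurable endpoint; the gateway URL, model name, and key are placeholders, not references to any real deployment or to this project's actual config:

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible /chat/completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Hypothetical internal gateway used in place of api.openai.com
req = build_chat_request(
    "https://llm-gateway.internal.example/v1",  # placeholder enterprise endpoint
    "dummy-key",                                # placeholder credential
    "my-hosted-model",                          # placeholder model id
    "Hello",
)
print(req.full_url)
```

The same shape would cover most "bring your own endpoint" setups; Bedrock is a different API surface (SigV4-signed, via the AWS SDK), which is why it would likely need to be a separate provider option rather than just a base-URL override.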

Metadata
Labels

enhancement (New feature or request)
