Custom Fine-tuned models on AWS to spacy-llm #347
-
Hi @innocent-charles, if you want to connect to a provider that is not natively supported yet, you have two options: wait for native support to land, or plug in your own custom model.
In the case of models deployed on AWS, there is currently a user-contributed PR in the works adding native support for Amazon Bedrock models, which might be useful for you. In the meantime you can plug in your own custom model (see the sketch below). Note that the task is orthogonal to the model, i.e. you can use any task with any model, and you can also combine a custom task with a custom model. It is not necessary, however, to write a custom task just because you use a custom model.
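For illustration, here is a minimal sketch of what such a custom model could look like. It assumes a spacy-llm version that accepts a plain `Iterable[str] -> Iterable[str]` callable as the model (newer versions may expect nested iterables for sharding), and the endpoint name `my-finetuned-llm`, the registered name `my_sagemaker_model.v1`, the request payload `{"inputs": ...}` and the response field `generated_text` are all placeholders that depend on how your model is deployed:

```python
import json
from typing import Callable, Iterable

import boto3
from spacy_llm.registry import registry


@registry.llm_models("my_sagemaker_model.v1")
def sagemaker_model(
    endpoint_name: str, region_name: str = "us-east-1"
) -> Callable[[Iterable[str]], Iterable[str]]:
    """Return a callable that sends prompts to a SageMaker endpoint and yields raw responses."""
    client = boto3.client("sagemaker-runtime", region_name=region_name)

    def _call(prompts: Iterable[str]) -> Iterable[str]:
        for prompt in prompts:
            # The payload format depends entirely on how the fine-tuned model was deployed.
            response = client.invoke_endpoint(
                EndpointName=endpoint_name,
                ContentType="application/json",
                Body=json.dumps({"inputs": prompt}),
            )
            payload = json.loads(response["Body"].read())
            # Adjust this to the actual response schema of your endpoint.
            yield payload.get("generated_text", "")

    return _call
```

In the pipeline config, such a registered model can then be combined with any built-in task, which is what "the task is orthogonal to the model" means in practice. A rough example, assuming the built-in NER task and the hypothetical names above:

```ini
[components.llm.task]
@llm_tasks = "spacy.NER.v3"
labels = ["PERSON", "ORG", "LOCATION"]

[components.llm.model]
@llm_models = "my_sagemaker_model.v1"
endpoint_name = "my-finetuned-llm"
```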
-
Hello @rmitsch and others! I was asking whether it is possible to write a custom task with spacy-llm that integrates a custom fine-tuned model deployed on AWS SageMaker or EC2, or even AWS Lambda...