How can TurboMind support LoRA deployment? Are there plans to support it in the future? #3710
Unanswered
akai-shuuichi asked this question in Q&A
Replies: 0 comments
I've recently been fine-tuning models with LoRA. When serving them with vLLM, the performance falls short of expectations (and vLLM is supposed to be the fastest option for multi-LoRA). With lmdeploy, only the PyTorch engine supports multi-LoRA, and its performance also doesn't meet expectations. Does the team have any plans to support multi-LoRA deployment with TurboMind in the future?
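For reference, a minimal sketch of the setup the question refers to: serving multiple LoRA adapters with lmdeploy's PyTorch engine, assuming the `adapters` option of `PytorchEngineConfig` and the `adapter_name` argument of the pipeline call. The model and adapter paths below are placeholders.

```python
# Minimal sketch: multi-LoRA serving with lmdeploy's PyTorch engine.
# Base model and adapter paths are placeholders; point them at your own checkpoints.
from lmdeploy import pipeline, PytorchEngineConfig

# Register LoRA adapters by name when building the engine.
backend_config = PytorchEngineConfig(
    adapters={
        'lora_a': '/path/to/lora_a',
        'lora_b': '/path/to/lora_b',
    }
)
pipe = pipeline('/path/to/base_model', backend_config=backend_config)

# Route each request to a specific adapter via adapter_name.
print(pipe(['Hello, who are you?'], adapter_name='lora_a'))
print(pipe(['Hello, who are you?'], adapter_name='lora_b'))
```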