[AutoDeploy][Feature]: Basic LoRA Support #8741

@govind-ramnarayan

Description

🚀 The feature, motivation and pitch

Support LoRA in AutoDeploy (a minimal sketch of the layer-level behavior in scope follows the list below). Minimum requirements are:

  • Single LoRA adapter
  • No sharding
  • No performance tuning
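
For scoping, here is a minimal sketch of what a single LoRA adapter on one unsharded layer looks like in plain PyTorch. This is illustrative only; `LoRALinear` is a hypothetical class for this issue, not AutoDeploy's actual API.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen base linear layer plus one low-rank adapter:
    y = W x + (alpha / r) * B(A(x))."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # base weights stay frozen
        # Low-rank factors: A projects down to rank r, B projects back up.
        self.lora_a = nn.Linear(base.in_features, r, bias=False)
        self.lora_b = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)  # adapter starts as a no-op
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * self.lora_b(self.lora_a(x))

# One adapter on one unsharded layer, matching the scope above.
layer = LoRALinear(nn.Linear(64, 64))
out = layer(torch.randn(2, 64))
```

The "no sharding, no performance tuning" constraints mean the adapter matmuls can run unfused and unsplit as above; fusing B·A into the base weight or sharding the factors is out of scope for this issue.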

Alternatives

No response

Additional context

No response

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and checked the documentation and examples for answers to frequently asked questions.

Metadata

Labels

  • Lora/P-tuning: Parameter-Efficient Fine-Tuning (PEFT) like LoRA/P-tuning in TRTLLM: adapter use & perf.
  • feature request: New feature or request. This includes new model, dtype, functionality support.

Projects

Status: Backlog
