
[Feature]: Add TRTLLM as an attention backend for Auto Deploy #10243

@bmarimuthu-nv

Description


🚀 The feature, motivation and pitch

The TRTLLM attention backend gives better performance than FlashInfer for some models. Until FlashInfer catches up, exposing the TRTLLM backend would let Auto Deploy reach on-par performance, so we should add it as a backend option.
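One way the request could surface is as an extra allowed value for Auto Deploy's attention-backend setting. A minimal sketch, assuming a config key named `attn_backend` and a set of existing backend names; the actual option name and accepted values in Auto Deploy may differ:

```python
# Hypothetical sketch of extending Auto Deploy's attention-backend choices
# with a "trtllm" option. All names here are illustrative, not the real API.
SUPPORTED_ATTN_BACKENDS = {"flashinfer", "triton", "trtllm"}


def make_autodeploy_config(attn_backend: str = "flashinfer") -> dict:
    """Validate the backend choice and return a minimal config dict."""
    if attn_backend not in SUPPORTED_ATTN_BACKENDS:
        raise ValueError(f"unknown attention backend: {attn_backend!r}")
    return {"attn_backend": attn_backend}


# Selecting the proposed TRTLLM backend:
config = make_autodeploy_config("trtllm")
```

The point of the sketch is only that adding the backend is a matter of registering one more accepted value plus the dispatch to the TRTLLM attention kernels behind it.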

Alternatives

No response

Additional context

No response

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and checked the documentation and examples for answers to frequently asked questions.

Metadata

Assignees

Labels

AutoDeploy (<NV> AutoDeploy Backend), feature request (New feature or request. This includes new model, dtype, functionality support)


Projects

Status

Ready

Milestone

No milestone


Development

No branches or pull requests
