
[https://nvbugs/5510879][fix] Fix pytorch & TRT-python flows fused LoRA adapter modules weight split with TP>1 #66822

Triggered via issue October 12, 2025 19:29
Status Skipped
Total duration 1s
bot-command.yml

on: issue_comment
Bot command check