Commit c7378ed
Fix, doc

Parent: 94c669c
2 files changed (+4 lines, -1 line)

tests/pipelines/run_compiled_model_hotswap.py
Lines changed: 1 addition & 1 deletion

@@ -123,7 +123,7 @@ def check_hotswap(do_hotswap):
     unet = get_small_unet()
     file_name = os.path.join(tmp_dirname, "pytorch_lora_weights.safetensors")
     unet.load_attn_procs(file_name)
-    # unet = torch.compile(unet, mode="reduce-overhead")
+    unet = torch.compile(unet, mode="reduce-overhead")
 
     torch.manual_seed(42)
     out0 = unet(**dummy_input)["sample"]
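For context, a minimal sketch (not taken from this commit; the toy model stands in for get_small_unet(), and the torch._dynamo.utils.counters keys are dynamo internals assumed here) of what the re-enabled line exercises: torch.compile with mode="reduce-overhead" caches the compiled graph, so a second call with identical shapes should reuse it rather than recompile.

# Hedged sketch: a toy module in place of the test's small UNet.
import torch
from torch._dynamo.utils import counters  # dynamo-internal stats; may change across versions

model = torch.nn.Linear(4, 4)  # stand-in for get_small_unet()
compiled = torch.compile(model, mode="reduce-overhead")

x = torch.randn(2, 4)
compiled(x)  # first call: graph capture and compilation
compiled(x)  # same shapes, same guards: should reuse the compiled graph

# If no recompilation happened, dynamo recorded exactly one unique graph.
assert counters["stats"]["unique_graphs"] == 1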

tests/pipelines/test_pipelines.py
Lines changed: 3 additions & 0 deletions

@@ -2070,6 +2070,9 @@ class TestLoraHotSwapping:
         tested there. The goal of this test is specifically to ensure that hotswapping with diffusers does not require
         recompilation.
 
+        The reason why we need to shell out instead of just running the script inside of the test is that shelling out is
+        required to collect the torch.compile logs.
+
         """
 
     @slow
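To illustrate the shell-out pattern the new docstring lines describe, here is a hedged sketch, assuming the script path from the other file in this commit and a "__recompiles" marker in the log output (the marker string is an assumption, not confirmed by this commit): TORCH_LOGS is read when torch is imported, so recompile logs can only be captured reliably from a fresh process's stderr.

# Hedged sketch of shelling out to collect torch.compile logs.
import os
import subprocess
import sys

env = {**os.environ, "TORCH_LOGS": "recompiles"}
proc = subprocess.run(
    [sys.executable, "tests/pipelines/run_compiled_model_hotswap.py"],
    capture_output=True,
    text=True,
    env=env,
)
# A successful hotswap should produce no recompile log lines on stderr.
assert "__recompiles" not in proc.stderr, proc.stderr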
