
Commit 1a57ce1

s4ayub authored and facebook-github-bot committed

Skip fsdp2 import if running with deploy (meta-pytorch#2483)

Summary:
Pull Request resolved: meta-pytorch#2483

As title: this import breaks deploy models.

Reviewed By: sayitmemory

Differential Revision: D64237929

fbshipit-source-id: 0cc908549c3cb7f6e66eb6d1ef0d0ccd3241d122

1 parent 3e58a31 commit 1a57ce1

File tree

1 file changed: +9 −1 lines

  • torchrec/distributed/train_pipeline

torchrec/distributed/train_pipeline/utils.py

Lines changed: 9 additions & 1 deletion

```diff
@@ -31,9 +31,17 @@
     Union,
 )
 
+import torch
 from torch import distributed as dist
 
-from torch.distributed._composable.fsdp.fully_shard import FSDPModule as FSDP2
+if not torch._running_with_deploy():
+    from torch.distributed._composable.fsdp.fully_shard import FSDPModule as FSDP2
+else:
+
+    class FSDP2:
+        pass
+
+
 from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
 from torch.fx.immutable_collections import (
     immutable_dict as fx_immutable_dict,
```
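The guarded import above follows a common pattern: when the interpreter is running under torch::deploy, some distributed submodules cannot be imported, so the code falls back to an empty stub class. The stub keeps later `isinstance(x, FSDP2)` checks valid Python (they simply return False) instead of raising at import time. A minimal standalone sketch of the pattern, with illustrative names not taken from the commit (`safe_import_fsdp2`, the `running_with_deploy` parameter standing in for `torch._running_with_deploy()`):

```python
import importlib


def safe_import_fsdp2(running_with_deploy: bool) -> type:
    """Return the real FSDP2 class when importable, else an empty stub.

    Sketch of the pattern in this commit; the real code does this at
    module import time rather than inside a function.
    """
    if not running_with_deploy:
        try:
            # Import lazily so environments without this submodule
            # (e.g. deploy interpreters) do not fail at import time.
            mod = importlib.import_module(
                "torch.distributed._composable.fsdp.fully_shard"
            )
            return mod.FSDPModule
        except ImportError:
            pass

    class FSDP2:
        # Empty stub: isinstance(obj, FSDP2) is well-formed but always
        # False, so downstream type checks degrade gracefully.
        pass

    return FSDP2
```

Defining a stub rather than setting `FSDP2 = None` is the key design choice: `isinstance(x, None)` would raise a `TypeError`, while an empty class keeps every call site working unchanged.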
