
Commit 77f55b9

silence destroy_process_group() warning (#1387)

1 parent 746c0a2 commit 77f55b9

File tree

3 files changed: +10 −0 lines changed


distributed/tensor_parallelism/fsdp_tp_example.py

Lines changed: 3 additions & 0 deletions

@@ -173,3 +173,6 @@
 rank_log(_rank, logger, f"2D iter {i} complete")

 rank_log(_rank, logger, "2D training successfully completed!")
+
+if dist.is_initialized():
+    dist.destroy_process_group()

distributed/tensor_parallelism/sequence_parallel_example.py

Lines changed: 4 additions & 0 deletions

@@ -22,6 +22,7 @@
 import torch
 import torch.nn as nn

+import torch.distributed as dist
 from torch.distributed._tensor import Shard

 from torch.distributed.tensor.parallel import (
@@ -107,3 +108,6 @@ def forward(self, x):
 rank_log(_rank, logger, f"Sequence Parallel iter {i} completed")

 rank_log(_rank, logger, "Sequence Parallel training completed!")
+
+if dist.is_initialized():
+    dist.destroy_process_group()

distributed/tensor_parallelism/tensor_parallel_example.py

Lines changed: 3 additions & 0 deletions

@@ -122,3 +122,6 @@ def forward(self, x):
 rank_log(_rank, logger, f"Tensor Parallel iter {i} completed")

 rank_log(_rank, logger, "Tensor Parallel training completed!")
+
+if dist.is_initialized():
+    dist.destroy_process_group()
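All three files gain the same guarded teardown, which silences the "process group has not been destroyed" warning PyTorch emits at interpreter exit. A minimal self-contained sketch of the pattern, assuming a single-process group on the CPU-only gloo backend (the examples themselves are launched with torchrun, which sets the rendezvous environment variables automatically):

```python
import os
import torch.distributed as dist

# torchrun normally provides these; set them manually for a
# standalone single-process run (addresses here are assumptions).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")

# gloo is the CPU backend; the tutorial examples use GPU backends.
dist.init_process_group(backend="gloo", rank=0, world_size=1)

# ... training loop would run here ...

# The pattern added by this commit: tear down the process group
# only if one was actually initialized, so the script exits cleanly
# whether or not distributed setup succeeded.
if dist.is_initialized():
    dist.destroy_process_group()
```

The `is_initialized()` guard also makes the teardown safe to call from code paths (e.g. early exits or single-device runs) where `init_process_group` was never reached.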
