Commit 63126cc

Remove a buggy/redundant reset
1 parent 1965ffa commit 63126cc

File tree: 1 file changed, +0 −1 lines changed


megatron/core/full_cuda_graph.py: 0 additions & 1 deletion
@@ -189,7 +189,6 @@ def __call__(self, *args, **kwargs):
         torch.cuda.synchronize()
         torch.distributed.barrier()
         logger.info(f'CUDA graph capture done!!!')
-        paged_stash_reset(enabled=self.moe_paged_stash and training)
         if FullCudaGraphWrapper.cuda_graph[training_str] is None:
             FullCudaGraphWrapper.result[training_str] = self.forward_backward_func(*args, **kwargs)
         else:
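
The surviving branch at the end of the hunk caches one captured graph per mode keyed by `training_str`: on the first call the function runs eagerly and the result is recorded, and on later calls the cached graph is replayed. A minimal, CUDA-free sketch of that capture-once/replay-thereafter pattern (the class and attribute names below are illustrative stand-ins, not Megatron's actual API):

```python
class CaptureOnceWrapper:
    """Run fn eagerly on the first call per mode, then 'replay' the cached capture."""

    # Class-level caches, shared across instances, keyed by mode string
    # (mirrors FullCudaGraphWrapper.cuda_graph / .result in spirit).
    graph = {"training": None, "inference": None}
    result = {}

    def __init__(self, fn):
        self.fn = fn

    def __call__(self, mode, *args, **kwargs):
        if CaptureOnceWrapper.graph[mode] is None:
            # First call for this mode: run eagerly and remember the callable.
            # In the real wrapper this is where a torch.cuda.CUDAGraph is captured.
            CaptureOnceWrapper.result[mode] = self.fn(*args, **kwargs)
            CaptureOnceWrapper.graph[mode] = self.fn
        else:
            # Subsequent calls: replay the cached capture (here: just re-invoke).
            CaptureOnceWrapper.result[mode] = CaptureOnceWrapper.graph[mode](*args, **kwargs)
        return CaptureOnceWrapper.result[mode]
```

In the real code the removed `paged_stash_reset(...)` call sat between capture and this branch; since the capture path already leaves state in the form the replay expects, an extra reset there was redundant (and, per the commit message, buggy).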
