Commit 26e5e6a

isururanawaka authored and facebook-github-bot committed

Adding a comment to clarify apply_optimizer_in_backward function call in sharding_single_rank_test_single_process (#3155)

Summary:
Pull Request resolved: #3155
Reviewed By: aporialiao
Differential Revision: D77699198
fbshipit-source-id: 8c8477b3b73730425d45ccdabb1d4282e29b9d02

1 parent bde9888, commit 26e5e6a

File tree: 1 file changed (+1, -1 lines)


torchrec/distributed/test_utils/test_sharding.py (1 addition, 1 deletion)

@@ -764,7 +764,7 @@ def sharding_single_rank_test_single_process(

     global_model_named_params_as_dict = dict(global_model.named_parameters())
     local_model_named_params_as_dict = dict(local_model.named_parameters())
-
+    # Registers a hook to update parameters in the backward pass, when gradients are computed.
     if apply_optimizer_in_backward_config is not None:
         for apply_optim_name, (
             optimizer_type,
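The comment added by this commit describes the "optimizer in backward" pattern: instead of a separate `optimizer.step()` after backward, a hook attached to each parameter applies the update the moment that parameter's gradient is produced. Below is a minimal pure-Python sketch of that idea under stated assumptions; `Param`, `register_post_grad_hook`, and `apply_sgd_in_backward` are hypothetical names for illustration only, not the TorchRec or PyTorch implementation.

```python
# Sketch of the "apply optimizer in backward" idea: each parameter holds a
# list of hooks that fire as soon as its gradient is accumulated, so the
# optimizer update happens inside the backward pass rather than afterwards.

class Param:
    def __init__(self, value):
        self.value = value
        self.grad = None
        self._post_grad_hooks = []

    def register_post_grad_hook(self, hook):
        # Plays the role of the hook registered by
        # apply_optimizer_in_backward: runs right after grad is set.
        self._post_grad_hooks.append(hook)

    def accumulate_grad(self, grad):
        # Called by the (simulated) backward pass when this parameter's
        # gradient is ready; hooks then update the parameter immediately.
        self.grad = grad if self.grad is None else self.grad + grad
        for hook in self._post_grad_hooks:
            hook(self)


def apply_sgd_in_backward(params, lr):
    # Register an in-backward SGD update on every parameter.
    def sgd_hook(p):
        p.value -= lr * p.grad
        p.grad = None  # grad is consumed on the spot; no need to retain it

    for p in params:
        p.register_post_grad_hook(sgd_hook)


if __name__ == "__main__":
    w = Param(1.0)
    apply_sgd_in_backward([w], lr=0.1)
    # Simulated backward pass: gradient of loss = w**2 at w = 1.0 is 2.0.
    w.accumulate_grad(2.0)
    print(w.value)  # 1.0 - 0.1 * 2.0 = 0.8
```

One practical payoff of this pattern, and a reason TorchRec uses it for sharded embedding tables, is that a gradient can be applied and freed as soon as it is computed, instead of being held alive until a global `optimizer.step()`.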
