Description
I define a method unfreeze_box_head on my model class as below:
import logging

import torch.nn as nn
from detectron2.modeling import META_ARCH_REGISTRY

logger = logging.getLogger(__name__)

@META_ARCH_REGISTRY.register()
class GeneralizedRCNN(nn.Module):
    def __init__(self, cfg):
        super().__init__()
        ...
        ...

    def unfreeze_box_head(self):
        # Re-enable gradients for every parameter of the box head.
        for p in self.roi_heads.box_head.parameters():
            p.requires_grad = True
        logger.info('Unfreeze roi_box_head parameters')
And I call it with:

if iteration == 500:
    model.module.unfreeze_box_head()
I expect this to unfreeze the weights of the box-head module at iteration 500. However, judging from the behavior of the loss, it does not seem to take effect. Is there any trick I need to pay attention to, perhaps something specific to DistributedDataParallel? Any advice is appreciated. Thanks.
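
For context, here is a rough sketch of the surrounding training loop. The config file name and the loop itself are simplified stand-ins for my actual script; the builder helpers (build_model, build_optimizer, build_detection_train_loader) are the standard detectron2 ones, and the script is assumed to be started with a distributed launcher so the process group is already initialized.

# Simplified stand-in for my training script, not the exact code.
import torch
from detectron2.config import get_cfg
from detectron2.data import build_detection_train_loader
from detectron2.modeling import build_model
from detectron2.solver import build_optimizer
from detectron2.utils import comm

cfg = get_cfg()
cfg.merge_from_file("my_config.yaml")    # hypothetical config path

model = build_model(cfg)                 # instantiates the registered meta-architecture
model = torch.nn.parallel.DistributedDataParallel(
    model, device_ids=[comm.get_local_rank()], broadcast_buffers=False
)
optimizer = build_optimizer(cfg, model)  # built while the box head is still frozen
data_loader = build_detection_train_loader(cfg)

for iteration, data in enumerate(data_loader):
    loss_dict = model(data)              # loss dict returned in training mode
    losses = sum(loss_dict.values())
    optimizer.zero_grad()
    losses.backward()
    optimizer.step()

    if iteration == 500:
        # the model is wrapped in DDP, so the method is reached via .module
        model.module.unfreeze_box_head()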