
How to unfreeze some weight parameters at a certain iteration during training? #2486

@Ze-Yang

Description


I define a method unfreeze_box_head as below.

import logging

import torch.nn as nn
from detectron2.modeling import META_ARCH_REGISTRY

logger = logging.getLogger(__name__)

@META_ARCH_REGISTRY.register()
class GeneralizedRCNN(nn.Module):
    def __init__(self, cfg):
        super().__init__()
        ...
        ...

    def unfreeze_box_head(self):
        # Re-enable gradients for every parameter of the box head.
        for p in self.roi_heads.box_head.parameters():
            p.requires_grad = True
        logger.info('Unfreeze roi_box_head parameters')
And I call it with

if iteration == 500:
    # The model is wrapped in DistributedDataParallel, hence the .module indirection.
    model.module.unfreeze_box_head()

I expect this to unfreeze the weights of the box-head module at iteration 500. However, judging by the behavior of the loss, it does not seem to take effect. I wonder whether there is any trick to pay attention to, perhaps something specific to the DistributedDataParallel framework. Any advice is appreciated. Thanks.
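
For what it's worth, one possibility I am considering (an assumption on my part, not something I have confirmed): a torch.optim optimizer only updates the parameters it was handed at construction, so if the frozen box-head parameters were filtered out when the optimizer was built, flipping requires_grad back would not be enough on its own. A minimal sketch of re-registering them at the unfreeze point, assuming optimizer is the training loop's torch.optim optimizer:

if iteration == 500:
    model.module.unfreeze_box_head()
    # Sketch: if the box-head parameters were excluded from the optimizer
    # when it was built, register them now so they actually receive updates.
    optimizer.add_param_group({
        'params': [p for p in model.module.roi_heads.box_head.parameters()
                   if p.requires_grad]
    })

Another thing that may matter, as far as I understand DistributedDataParallel: DDP sets up its gradient-synchronization hooks over the parameters that require grad at the time the model is wrapped, and changing requires_grad afterwards is, to my knowledge, not supported. So it might be necessary to wrap the model only after the final requires_grad configuration is decided, or to rebuild the DDP wrapper after unfreezing.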
