
Error when loading pretrained SCT #20

Description

@glimmer16

Hi, I encountered the following error when running test_video_frame.py:

Traceback (most recent call last):
  File "test_video_frame.py", line 120, in <module>
    SCT.load_state_dict(torch.load(args.SCT))
  File "D:\Anaconda\envs\CCPL\lib\site-packages\torch\nn\modules\module.py", line 1407, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for SCT:
        Unexpected key(s) in state_dict: "cnet.4.weight", "cnet.4.bias", "snet.4.weight", "snet.4.bias".
        size mismatch for cnet.0.weight: copying a param with shape torch.Size([256, 512, 1, 1]) from checkpoint, the shape in current model is torch.Size([128, 256, 1, 1]).
        size mismatch for cnet.0.bias: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([128]).    
        size mismatch for cnet.2.weight: copying a param with shape torch.Size([128, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([32, 128, 1, 1]).
        size mismatch for cnet.2.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([32]).     
        size mismatch for snet.0.weight: copying a param with shape torch.Size([256, 512, 3, 3]) from checkpoint, the shape in current model is torch.Size([128, 256, 3, 3]).
        size mismatch for snet.0.bias: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([128]).    
        size mismatch for snet.2.weight: copying a param with shape torch.Size([128, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([32, 128, 1, 1]).
        size mismatch for snet.2.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([32]).     
        size mismatch for uncompress.weight: copying a param with shape torch.Size([512, 32, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 32, 1, 1]).
        size mismatch for uncompress.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).

It looks like something went wrong while loading the pretrained SCT weights. How can I fix this?
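The unexpected keys and size mismatches in the traceback suggest the checkpoint was saved from a larger SCT (an extra layer and double the channel widths) than the one test_video_frame.py builds, so the model configuration at test time likely differs from the one used for training. As a first diagnostic step, one can compare the checkpoint's parameter shapes against the model's. A minimal sketch, assuming the shape dicts are built as `{k: tuple(v.shape) for k, v in torch.load(path).items()}` and `{k: tuple(v.shape) for k, v in model.state_dict().items()}` (the helper and the example shapes below are illustrative, not part of the repo):

```python
def diff_state_dicts(ckpt_shapes, model_shapes):
    """Compare parameter shapes from a checkpoint against a model.

    Both arguments map parameter names to shape tuples.
    Returns (unexpected_keys, missing_keys, size_mismatches).
    """
    unexpected = sorted(set(ckpt_shapes) - set(model_shapes))
    missing = sorted(set(model_shapes) - set(ckpt_shapes))
    mismatched = {
        k: (ckpt_shapes[k], model_shapes[k])
        for k in ckpt_shapes
        if k in model_shapes and ckpt_shapes[k] != model_shapes[k]
    }
    return unexpected, missing, mismatched


# Shapes modeled on the traceback above: the checkpoint has an extra
# layer (cnet.4) and twice the channel width of the model being built.
ckpt = {
    "cnet.0.weight": (256, 512, 1, 1),
    "cnet.2.weight": (128, 256, 1, 1),
    "cnet.4.weight": (32, 128, 1, 1),
}
model = {
    "cnet.0.weight": (128, 256, 1, 1),
    "cnet.2.weight": (32, 128, 1, 1),
}
unexpected, missing, mismatched = diff_state_dicts(ckpt, model)
print("unexpected:", unexpected)   # layers present only in the checkpoint
print("mismatched:", mismatched)   # same name, different shape
```

If the repo builds SCT differently depending on a mode or size argument (for example, an artistic vs. photorealistic variant), a mismatch like this usually means the test command's arguments disagree with the ones the checkpoint was trained with, so checking those arguments against the training setup is a reasonable first step.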
